
  • andy o - Monday, February 7, 2011 - link

    Thanks for the updates, first of all.

    Black level, rather than brightness, is more indicative of contrast ratio in real-world use, because to achieve the maximum contrast you have to push brightness to the max, and on an LCD that can be blinding. I have a pro NEC monitor for photos, and it reaches about 390:1 at 110 cd/m2, which is about what a calibrated monitor should be. It can go up to about 800:1 at max brightness, but it's useless at that setting.
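
    If it helps: contrast ratio is just white luminance divided by black luminance, so you can back out the implied black level. A quick C sketch using only the two figures above (nothing else measured):

    ```c
    #include <stdio.h>

    /* Contrast ratio = white luminance / black luminance, so the black
       level implied by a measured ratio is simply white / ratio. */
    int main(void) {
        double white = 110.0; /* cd/m^2, calibrated brightness from above */
        double ratio = 390.0; /* contrast ratio measured at that setting  */
        printf("Implied black level: %.3f cd/m^2\n", white / ratio); /* ~0.282 */
        return 0;
    }
    ```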

    That said, black level quality also varies with different types of LCD. IPS-based panels usually have a higher (worse) black level, but dark color tracking is much better. PVA screens suck when you look at them straight on. See http://forums.dpreview.com/forums/read.asp?forum=1... to see what I mean. Most laptops have cheap TN panels, though. I think these qualities should be considered in monitor reviews as well, just a thought.
  • andy o - Monday, February 7, 2011 - link

    So no html...

    Anyway, my point was that the Asus laptop will likely reach a higher contrast ratio at regular, usable brightnesses. If the Envy's panel is IPS, though, I'd choose a much lower contrast ratio in order to have better dark colors and consistent colors.
  • JarredWalton - Monday, February 7, 2011 - link

    Most of the laptops and displays I've tested have been generally consistent in contrast ratio, so if you get 1000:1 at maximum brightness, dropping to 100 nits will still give close to 1000:1 -- it might be 900:1 or it might even increase to 1100:1, but that's not enough to really make a difference. I usually feel like you need at least a 25% change in contrast before you really notice it with the naked eye.
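
    To put rough numbers on that 25% rule of thumb, here's a quick sketch with typical values like the ones above (made-up examples, not measurements from any specific panel):

    ```c
    #include <stdio.h>
    #include <math.h>

    /* Relative difference between two contrast ratios, compared against a
       rough 25% "noticeable to the naked eye" threshold. */
    static double rel_change(double base, double other) {
        return fabs(other - base) / base;
    }

    int main(void) {
        double at_max = 1000.0;                  /* contrast at max brightness */
        double at_100nits[] = { 900.0, 1100.0 }; /* typical spread at 100 nits */
        for (int i = 0; i < 2; i++) {
            double d = rel_change(at_max, at_100nits[i]);
            printf("%.0f:1 vs %.0f:1 -> %.0f%% change: %s\n",
                   at_max, at_100nits[i], d * 100.0,
                   d >= 0.25 ? "noticeable" : "hard to see");
        }
        return 0;
    }
    ```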

    As far as IPS panels and laptops are concerned, the only IPS option I'm aware of right now is the upgraded HP EliteBook 8740w LCD, which costs I think $550 or so. Ouch!
  • softdrinkviking - Tuesday, February 8, 2011 - link

    the 15" HP Elitebook also lets you choose a dreamcolor HD display as well (which I think is what indicates IPS) $425 upgrade. still high, but where else can you get a good, non-apple laptop display?
  • Luke2.0 - Monday, February 7, 2011 - link

    1. Nice opening image of broken chip...

    2. I was looking forward to a review of Asus N53SV (or SN). Is it among those delayed / canceled ones?

    3. (personal rant) I might start tinkering on Ivy Bridge now...
  • JarredWalton - Monday, February 7, 2011 - link

    Everything with Sandy Bridge is at least delayed right now, including the N53, G53, and G73 updates. I hadn't received any of the ASUS models yet, but I was expecting them to arrive last week. Then Intel drops that bomb and everything SNB related disappears. :-(
  • MrSpadge - Tuesday, February 8, 2011 - link

    For regular notebooks they should just use the 2 SATA3 ports and be done with it.

    MrS
  • DanNeely - Tuesday, February 8, 2011 - link

    Many laptops have eSATA ports, so they need the fix. Beyond that, even if the boards aren't using the faulty ports, you can be certain that some bottom-feeding class-action lawyer would end up suing over every dead port if they used the faulty chipset.
  • vikingrinn - Monday, February 7, 2011 - link

    @Jarred Walton or Vivek Gowri

    Since you compared it with the G73Jw, did the "One such notebook came with a “low-end” i7-2630QM processor and a GTX 460M GPU, packed into a 15.6” chassis" just so happen to have a 17.3" display with backlit keyboard? ;)
  • BWMerlin - Monday, February 7, 2011 - link

    @vikingrinn How can it have a 17.3" display when the chassis is only 15.6"?

    My bet is either the ASUS G53 or the MSI equivalent.
  • vikingrinn - Tuesday, February 8, 2011 - link

    @BWMerlin You might be right, but a 17.3" display in a 15.6"-sized chassis isn't entirely implausible (although I'm not sure if they slimmed down the chassis of the G73 for the G73SW release?), as the M17x R3 was slimmed to almost the same size chassis as the M15x and also had 900p as a display option.
  • JarredWalton - Tuesday, February 8, 2011 - link

    Note that I updated the article. MSI said I could pass along the fact that the testing was done with their GT680R. It's certainly fast enough for gaming, though there are some areas that could be improved (unless you like glossy plastic). Now we wait for PM67 version 1.01....
  • vikingrinn - Tuesday, February 8, 2011 - link

    @JarredWalton Thanks for the update - looking forward to a review of both the M17x R3 and G73SW soon then! ;)
  • stmok - Monday, February 7, 2011 - link

    "What we know of Llano is that it will combine a K10.5 type CPU architecture with a midrange DX11 GPU (something like the HD 5650), integrated into a single chip."

    Firstly, AMD's Llano will be marketed as its "A-series" APU line. (Where G-series, E-series and C-series belong to their Bobcat-based lines.)

    Llano is a modified version of the Athlon II series with a Radeon HD 5550-class GPU as its IGP. The APU will feature Turbo Core 2.0 technology (power gating, etc.). It will use DDR3-1600 memory.

    Llano's x86 cores are codenamed "Husky".

    The IGP in Llano has two versions:
    One is codenamed "Winterpark" => only in dual-core versions of the APU.
    One is codenamed "Beavercreek" => only in triple- and quad-core versions of the APU.

    For TDP spec, there will be two distinct lines for the desktop version of Llano.
    => 65W (dual-cores and low power quad-cores) and 100W (triple and quad-cores).

    The solution will also allow for Hybrid CrossFire configurations:
    => Llano IGP + Radeon HD 6570 or HD 6670 video cards.

    Performance-wise... (according to an AMD presentation I saw)

    Dual-core Llano
    => Overall, lags slightly behind the Athlon II X2 250 (3.0GHz) and Pentium E6500 (2.93GHz)

    Quad-core Llano
    => It's slightly slower than a current Athlon II X4 630 with a Radeon HD 5550 discrete video card.

    So in the end...

    Sandy Bridge => Far better CPU side. Not as good with IGP.
    Llano => Far better IGP. Not as good on CPU side.

    If you want an APU that will be revolutionary, it's best to wait for "Trinity" in 2012.
  • Taft12 - Monday, February 7, 2011 - link

    This is great detail, more than I have ever seen about Llano before now (and thanks a bunch for it!)

    Is this from publicly available AMD documentation? You said this was from a presentation you saw...
  • Kiijibari - Monday, February 7, 2011 - link

    First, you wrote APU, even though there is no Bulldozer APU yet. Zambezi and Interlagos/Valencia are normal CPUs. You correctly mentioned Trinity later, which is an APU, but that is already Bulldozer v2.0, and it is far away, due in 2012.

    Second, you stated that cache sizes are unknown - they are not:
    See AMD's blog, link removed due to SPAM detection bot.

    Third, you speculate about a launch similar to the K8's in 2003; however, it is already known that desktop parts will launch *prior* to server parts in Q2:
    <Link removed due to SPAM detection, just read the analyst day slides again>
  • JarredWalton - Monday, February 7, 2011 - link

    I've corrected some of the text to clarify the meaning. Orochi is the eight-core design, with "Zambezi" for desktops and "Valencia" destined for servers. AFAICT, it's the same chip with different packages depending on the market (and I'd guess AMD is using the extra time between desktop and servers to do extra validation). Zambezi is also apparently a name for the desktop platform in general, unless the four-core and six-core Zambezi parts end up with a separate name.

    Given the purported size of the Orochi core, I can see four-core and six-core being harvested die, but they're still going to be huge. Right now, it appears the eight-core will have 16MB total L2 cache (2MB per core!) and an additional 8MB L3 cache. Long-term, the four-core and six-core should get separate designs so they don't have to be quite so large. Those are the chips that I expect won't be out for desktops until Q3/Q4.
  • Cow86 - Tuesday, February 8, 2011 - link

    Sorry there Jarred, first time poster, long time reader, but I hád to correct you on this :P Two things are wrong in what you say:

    1) The 8-core, 4-module Bulldozer chip will have 8 MB of L2 cache (2 MB shared per MODULE, not per core) and 8 MB of L3 cache. This has been confirmed by Fruehe in discussions plenty of times, and you'll find it all over the web.

    2) Whilst you can indeed expect the 6-core to be harvested (as it will also keep the 8 MB of L3 cache), it is rather clear the 4-core will be separate, like the dual-core Athlon II is now. The clue to this is the fact that the 4-core chip will only have 4 MB of L3 cache.

    http://www.techpowerup.com/134739/AMD-Zambezi-Bull...

    Look at the roadmap :)
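
    To make the per-module vs. per-core mix-up concrete, a trivial sketch (using only the 4-module / 2 MB figures above):

    ```c
    #include <stdio.h>

    /* Bulldozer cache math: 4 modules, 2 cores per module, 2 MB of L2
       shared per MODULE. Reading that as "per core" doubles the total. */
    int main(void) {
        int modules = 4, cores_per_module = 2, l2_per_module_mb = 2, l3_mb = 8;
        int l2_correct = modules * l2_per_module_mb;                    /*  8 MB */
        int l2_misread = modules * cores_per_module * l2_per_module_mb; /* 16 MB */
        printf("L2 at 2MB per module: %d MB (plus %d MB L3 = %d MB total)\n",
               l2_correct, l3_mb, l2_correct + l3_mb);
        printf("L2 misread as 2MB per core: %d MB\n", l2_misread);
        return 0;
    }
    ```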
  • JarredWalton - Wednesday, February 9, 2011 - link

    Oh, I guess I read the "2MB per module" figure wrong -- I thought they had said 2MB per core. Somewhere else said 16MB cache, and that then made sense, but if it's 16MB cache total (8MB L2 plus 8MB L3) that also works. Anyway, long-term it would be potentially useful to have separate die for 3-module and 2-module designs as well as the standard 4-module, because even the 6-core is still going to have 2MB of cache and two cores disabled. However, the time to do such a redesign might make it too costly, so maybe not. There's nothing to prevent AMD from disabling part of the L3 cache as well as the cores for a 4-core version, though -- we've already seen Athlon X2 chips that were harvested Phenom X4 die, for instance. That's definitely not something you want to do a lot if you can avoid it, obviously.
  • DanNeely - Monday, February 7, 2011 - link

    "There’s actually a lot more work involved in moving a Redwood GPU architecture to 32nm, as most of the Intellectual Property (IP) related to GPUs targets the so-called half-nodes (55nm, 40m, and in the future 28nm). It’s one reason we expect AMD to eventually move all of their CPU and GPU production to such nodes, but that's a ways off and Llano will use the same process size as Intel’s current CPUs."

    What's actually different between the two? I assumed it was just a case of what they picked as the next scaling point. There've been a number of GPUs in the past that have dropped from half to full to half node again as each one became widely available. I'd've assumed the main engineering challenge would be optimizing for the quirks in GF's processes instead of TSMC's.
  • JarredWalton - Monday, February 7, 2011 - link

    There's a lot of licensed technology in most GPUs, and most of that exists on the half-nodes right now. Back in the 90nm and 65nm days it didn't really matter, but when TSMC went to 55nm and then 40nm a lot of the companies doing design work on various modules went that route rather than sticking with the CPU nodes. So it's not just a quick and dirty process shrink, but the end result could be very interesting.
  • DanNeely - Monday, February 7, 2011 - link

    That didn't answer my question: what is different between half and full nodes that makes it more than just a process shrink?
  • JarredWalton - Monday, February 7, 2011 - link

    Sorry... AFAIK, nothing is different, other than extra work involved porting IP from 40nm (ATI's current target) to 32nm.
  • DanNeely - Tuesday, February 8, 2011 - link

    In that case, why are you expecting AMD to move everything to half-node processes?
  • JarredWalton - Tuesday, February 8, 2011 - link

    Because when everything else moves to 28nm, AMD would have their IP on 32nm; then next will be 20nm and 22nm. In my talks with AMD and GlobalFoundries at CES, they didn't outright state that they would move over, but right now the only ones really doing things on the "full nodes" are AMD and Intel. If you want to get in on the smartphone and tablet stuff -- or other SoC designs -- it makes it far easier to be able to license chunks of the design from others.
  • Soleron2 - Monday, February 7, 2011 - link

    "Anand guessed at a Q3/Q4 2011 launch for desktop Bulldozer, which means Bulldozer might not join the mobile party until Q4’11 or perhaps even 2012."

    Desktop Bulldozer is Q2 '11 according to AMD, officially. John Fruehe has confirmed this multiple times. Server Bulldozer is Q3 '11.
  • JarredWalton - Monday, February 7, 2011 - link

    I clarified the text... high-end desktop will be first, but it's basically the server chip. I think the "mainstream" desktop stuff will come later, so basically we're getting Athlon FX equivalent first, then Opteron, and then regular Athlon (to draw parallels with the K8 rollout).
  • icrf - Monday, February 7, 2011 - link

    "multithreaded tasks like video encoding and 3D rendering generally need more floating-point performance"

    My understanding is video encoding is very integer intensive, or at least any DCT-based ones. I'm told x264 spends most of its time in integer SIMD, so I'm not sure standard integer cores matter much, as the vector hardware is where everything is happening.
  • JarredWalton - Monday, February 7, 2011 - link

    I believe video encoding apps have been optimized to use a lot of SSE code, which means even if they're doing INT work in SSE, it still uses the FP/SSE registers. Anyway, without hardware we really just can't say for sure how Bulldozer will perform -- or what sort of power it will require. I'm guessing it will be competitive with Sandy Bridge on some things, faster in pure INT workloads, and slower in FP/SSE. But for mobility, I think it might use a lot more power than most notebooks can provide. We'll see in a few months.
  • SteelCity1981 - Monday, February 7, 2011 - link

    Clock speed also makes a difference, seeing as the i7-2630QM is 270MHz faster than the i7-720QM.

    AnandTech should underclock an i7-2630QM to match the i7-720QM's clock speed in one of its tests to see how much faster the i7-2630QM is clock for clock.
  • Stuka87 - Monday, February 7, 2011 - link

    I keep having to wait longer and longer to get a notebook. But I don't want to buy a previous generation machine :/

    Thanks for the update though :)
  • ajp_anton - Monday, February 7, 2011 - link

    "and multithreaded tasks like video encoding and 3D rendering generally need more floating-point performance."

    The developers of x264 say that it (don't know about other encoders) uses pretty much only integer math.
  • JarredWalton - Monday, February 7, 2011 - link

    See above: do they do integer math using MMX/SSE, or do they do it with regular integer instructions? At least one post I read from an x264 developer (http://x264dev.multimedia.cx/archives/51) makes me think they're using SSE/MMX extensions, which would use the FP registers if I'm not mistaken. "Emulating these float ops with complex series of integer ops is far too slow to be worthwhile, so unfortunately we cannot fully abide by Intel’s advisories." If anyone can confirm how much pure integer vs. MMX/SSE video encoding uses, I'm all ears.

    x264 is only one implementation, so there's also H.264 in general, WME, DivX, etc. we could discuss. Given how the GPU people are leveraging their DX10/11 cores to accelerate encoding/transcoding, and GPUs are considered "FP monsters" (even if they're working on INTs in FP registers), it again makes me think the dual-integer-core design of Bulldozer might not be ideal for video encoding. We'll find out for sure in the next few months when the CPUs hit retail, of course, so all I can do right now is speculate.
  • ajp_anton - Tuesday, February 8, 2011 - link

    I don't know enough to give a good answer, but I think it has something to do with the fact that you can operate on multiple 8-bit ints at once instead of a single 32-bit value, which is what gives x264 so much speed. No (CPU) "FP monster" can make up for this advantage, as they can only do 32-bit floats.
    You could convert x264 to use floats instead of ints everywhere and get the same result, but you would lose a lot of speed.
    Or something... =)
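
    For anyone curious, here's a minimal sketch with SSE2 intrinsics (my own illustration, not actual x264 code) of why the packed 8-bit integer path is such a win: one instruction covers 16 lanes versus 4 for single-precision floats, and both use the same XMM registers, which is why "integer" SIMD still shows up as SSE work:

    ```c
    #include <emmintrin.h> /* SSE2 integer ops (also pulls in SSE float ops) */
    #include <stdint.h>
    #include <stdio.h>

    int main(void) {
        uint8_t a[16], b[16], out[16];
        for (int i = 0; i < 16; i++) { a[i] = (uint8_t)i; b[i] = 10; }

        /* 16 pixel additions in ONE instruction (packed 8-bit, saturating) */
        __m128i va = _mm_loadu_si128((const __m128i *)a);
        __m128i vb = _mm_loadu_si128((const __m128i *)b);
        _mm_storeu_si128((__m128i *)out, _mm_adds_epu8(va, vb));

        /* The float equivalent only covers 4 lanes per instruction */
        float fa[4] = {0, 1, 2, 3}, fb[4] = {10, 10, 10, 10}, fo[4];
        _mm_storeu_ps(fo, _mm_add_ps(_mm_loadu_ps(fa), _mm_loadu_ps(fb)));

        printf("int lanes/op: 16, float lanes/op: 4\n");
        printf("out[3]=%u, fo[3]=%.0f\n", out[3], fo[3]); /* both 13 */
        return 0;
    }
    ```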
  • LostPassword - Monday, February 7, 2011 - link

    Sad to hear about the HP LCD. It's the main reason I considered an Envy.
  • jonup - Monday, February 7, 2011 - link

    Why are all Sandy Bridge laptops recalled if the two SATA3 ports are not affected by the recall? Most notebooks utilize only one HDD/SSD and an optical drive. Were most laptops built around the SATA2 controller, or is it that the SATA3 controller is affected by the recall but not as frequently as the SATA2 one? Is the holdup mandated by Intel regardless of which controller is used? Can you elaborate on this? It will be greatly appreciated.
    Thank you!
  • JarredWalton - Monday, February 7, 2011 - link

    eSATA, and I think a lot of laptops may have simply used the SATA 3.0Gbps ports even though 6.0Gbps ports were available.
  • jonup - Monday, February 7, 2011 - link

    Thanks!
  • jonup - Tuesday, February 8, 2011 - link

    I just read that Intel is allowing partners to ship devices if the partners guarantee that they are not using the faulty controller. So we should start seeing some Sandy Bridge lappies soon.

    Disclaimer: Intel's announcement came after your response yesterday.
  • JarredWalton - Tuesday, February 8, 2011 - link

    Yeah, I would think some of the budget stuff (e.g. Acer systems with only one HDD, one DVD/BRD, and no eSATA) could go out. Hope to see it soon!
  • concernedsophist - Monday, February 7, 2011 - link

    This 10-inch Ontario netbook has a 720p LCD.
  • JarredWalton - Tuesday, February 8, 2011 - link

    Does it really? It says "internal max resolution" at the linked site, and it also says "XGA" (1024x768). I would be surprised if they had an actual 1280x720 display, but I'll find out soon enough -- the laptop should arrive next week.
  • concernedsophist - Wednesday, February 9, 2011 - link

    Yes, I am typing on one right now. Screen res running at 1280x720. The screen looks pretty good, having subjectively compared it to some friends' older netbooks. Running pretty snappy in comparison too.
  • concernedsophist - Wednesday, February 9, 2011 - link

    Upgraded to 2GB of RAM. I heard some rumors that the hardware will recognize 4GB, but the OS needs to be upgraded to use more than 2GB.
  • Hrel - Monday, February 7, 2011 - link

    There's a 15.6" Clevo based on Sandy Bridge with a GTX460M in it that I found on Xoticpc.com and cyberpowerpc.com that I was told by the staff at both websites supported Optimus. AvaDirect also had it but for a much higher price. I think Xotic is the only one that still has any.

    It had a solid looking chassis and a very large battery. I'd love to see a review of that unit.
  • ntsan - Tuesday, February 8, 2011 - link

    The Acer 522 has a 1280x720 resolution:
    http://www.amazon.com/gp/product/B004GILTB6?ie=UTF...

    Lots of buyers said the screen is nice; are you saying they're lying?
  • JarredWalton - Tuesday, February 8, 2011 - link

    I corrected the text, but saying "1280x720 is better than 1024x600" doesn't tell us much about the display quality. I don't expect much in terms of contrast, but that's nothing new. I'm just surprised to see 720p on a 10.1" panel -- I'm a little skeptical that it's really 720p and not 768p, but as stated above, I'll find out soon enough.
  • flashbacck - Thursday, February 10, 2011 - link

    *sigh* why is it so hard for PC manufacturers to come up with a nicely designed laptop? Say what you will about the Apple "ecosystem," they sure know how to design nice hardware. I just wish people on the PC side were capable of doing the same.
  • thrylos - Tuesday, February 7, 2012 - link

    These are my laptop's specs:

    Intel(R) Core(TM) i7 CPU Q740 @ 1.73GHz, 4 cores
    Installed Memory (RAM): 8GB
    Graphics Adapter: NVIDIA GeForce GT 425M 2GB
    Display: 14.0 inch, 16:9, 1366x768 pixels

    My XPS runs hot. It's getting hotter and hotter over time, on both sides (front and back), so every program crashes. Does anybody know how I can fix it?
