96 Comments
inighthawki - Tuesday, March 31, 2015 - link
You guys really need to get the hardware for measuring input lag. For most displays, not knowing wouldn't be a big deal, since most are reasonable for basic activities. But when reviewing a gaming display, this is a pretty critical thing to know.
Eidigean - Tuesday, March 31, 2015 - link
I completely agree, input lag is very important. I still use my ZR30w IPS display, which does not have a scaler or OSD, and the input lag is much lower than on the Dell U3011.
From Brian Klug's review...
http://www.anandtech.com/show/4070/dell-u3011-revi...
The U3011's scaler / OSD added 12ms of latency compared to the ZR30w.
Souka - Tuesday, April 7, 2015 - link
I game on a 24" Sony CRT computer monitor.... my lag is in nano-seconds. :)(Well, many years ago I was using one)
Kinda miss those screens
willis936 - Tuesday, March 31, 2015 - link
Honestly. It's months overdue. It's arguably one of the most important things to measure on any display, let alone ones with gamer-oriented features (cutting edge adaptive sync technologies).
jimjamjamie - Wednesday, April 1, 2015 - link
I'll even disable uBlock for a few weeks to help raise funds.
Byte - Thursday, April 2, 2015 - link
Darn, almost peed my pants, then saw it was only 1080. Was kinda surprised when I saw DP 1.2a and HDMI 1.3, but that explained it. Give us 1440 curved with FreeSync and it's probably the pinnacle for easily 5 years till OLEDs take over.
okashira - Thursday, April 2, 2015 - link
I would have been embarrassed to post any kind of FreeSync or FreeSync vs G-Sync review without exact input lag numbers.
blanarahul - Tuesday, March 31, 2015 - link
*sniff* *sniff* Nobody cares about the high contrast ratio VA panels.
MrCommunistGen - Tuesday, March 31, 2015 - link
I'd love to pick up a G-Sync or FreeSync version of the BenQ BL3200PT if they could pull off even a slightly wider range of refresh rates than we see here in this review. 40Hz to 80Hz would allow them to double strobe every frame at 40Hz or below.
Samus - Wednesday, April 1, 2015 - link
Interestingly, what I take away from this review is that AMD screwed up one of the benefits of G-Sync: the range of refresh. Even though this monitor is capped at 75Hz (i.e. 75 FPS), even with vsync off it should know better than to drop below 60 FPS/60Hz, because tearing is more annoying than judder at lower FPS; the tearing stays on the screen longer since the refresh rate is lower. That's a real oversight...
Personally I just stick with 144Hz panels (all of which are TN, unfortunately) and always have vsync on. With a powerful enough card that never drops below 60+ fps it is butter smooth and looks smooth as butter.
mobutu - Wednesday, April 1, 2015 - link
There's one really good 144Hz IPS panel/monitor: http://www.tftcentral.co.uk/reviews/acer_xb270hu.h...
3DVagabond - Wednesday, April 1, 2015 - link
It's not an AMD display. It's LG. AMD only makes the cards that can do dynamic refresh rates with the DP1.2a standard. Whatever the specs or features are for the panel are at the monitor manufacturer's discretion.
Murloc - Wednesday, April 1, 2015 - link
That's one advantage of G-Sync: more control over monitor features, since they can just refuse to license the board. It protects the brand by giving it a premium feel (putting aside the horrible RMA rates I heard about the Swift), and people fall for it.
medi01 - Thursday, October 22, 2015 - link
There is no such advantage. DP 1.2a is a standard; AMD FreeSync is a sticker which they can decide to give or not.
Ubercake - Wednesday, April 1, 2015 - link
I agree. This implementation of adaptive sync is pretty bad. Frame rates in games like Battlefield 4 are often above 90 while playing and then can drop into the 50s at different points. With most games the frame rates are all over the place based on GPU demand and don't fit into this 27Hz/27fps range.
dragonsqrrl - Wednesday, April 1, 2015 - link
As mobutu said, Acer currently has a 144Hz IPS G-Sync monitor on the market, and to top it all off it's 27" 1440p. Linus recently did a great video review of the XB270HU:
https://www.youtube.com/watch?v=_LTHr96NueA
jjj - Tuesday, March 31, 2015 - link
At just 1080p it feels like a waste to go there. Even 1440p doesn't feel like enough if you are gonna invest this much in a new screen.
Anyway, it might be wise to include screen dimensions and screen area in the specs. With such huge differences in AR, the diagonal is misleading. Hell, I would even chart price per square cm, but I don't expect you to do that. A 34 inch 16:10 screen would be almost 25% bigger than this one, and a 30 inch 16:10 is almost the same area. In a better world regulators would force retailers to properly display screen dimensions; 99.9% of consumers don't realize the differences in size that come with AR.
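For anyone who wants to sanity-check those size claims, width, height, and area follow directly from the diagonal and the aspect ratio (h = d / sqrt(1 + a^2), w = a * h). A quick illustrative sketch, panel sizes only (bezels ignored), figures rounded:

```cpp
#include <cmath>
#include <cstdio>

// Panel width, height, and area from diagonal (inches) and aspect ratio a = w/h.
static void panel(const char* name, double diag, double aspect) {
    double h = diag / std::sqrt(1.0 + aspect * aspect);
    double w = aspect * h;
    std::printf("%-10s %5.1f\" x %4.1f\", area %5.0f sq in\n", name, w, h, w * h);
}

int main() {
    panel("34\" 21:9",  34.0, 2560.0 / 1080.0);  // this monitor: ~31.3" x 13.2", ~414 sq in
    panel("30\" 16:10", 30.0, 16.0 / 10.0);      // ~25.4" x 15.9", ~404 sq in (nearly the same area)
    panel("34\" 16:10", 34.0, 16.0 / 10.0);      // ~28.8" x 18.0", ~520 sq in (roughly 25% more)
    return 0;
}
```

The output matches the comparison above: a 30" 16:10 panel covers almost the same area as this 34" 21:9 panel, while a 34" 16:10 panel would be about a quarter larger.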
xthetenth - Tuesday, March 31, 2015 - link
3440x1440 is really worth it because it's the best resolution you can get for working without messing around with scaling (and frankly it's better than the ones you'd want to use with scaling in most regards for functionality).
jjj - Tuesday, March 31, 2015 - link
At $650 it's a 5-years-or-more purchase, not 1-2 years. And that's mostly 4K territory.
Buying a 27 inch 1080p IPS at $200 on an easy to find deal is OK for 1-2 years of usage, but this is a lot more. 1440p is much better than 1080p of course, but it also costs a lot more and you end up in a similar situation.
xthetenth - Wednesday, April 1, 2015 - link
For working purposes I would not consider a 16:x 4K an upgrade from 3440x1440 at all. I would be trading enough x space to have a third item up, or a wide item and a narrow one up at the same time, in return for a small amount of y space that doesn't make a meaningful difference. Past roughly 1200 pixels tall, 21:9 is by far the best aspect ratio for work. By 1200 pixels there's plenty of y space, so the information added by increasing y space further faces seriously diminishing returns, while x space is starting to go from two pretty wide windows to three windows side by side, which still gives significant returns.
Of the current selection of monitors, I would definitely choose 3440x1440 to keep for 5 years, and spending that much tends to come with a very nice, calibrated screen. A $300-$400 2560x1440 isn't the same quality screen.
wweeii - Tuesday, March 31, 2015 - link
Theoretically, even if you drop below 48Hz it shouldn't be all bad.
Between 16 and 24 fps, you can just triple the refresh rate and stay inside the 48-75Hz window, which would work just fine without tearing.
Between 24 and 37 fps, you double the refresh rate, so no problem either.
You would only have a problem between 37 and 48 FPS, which is unfortunate.
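The doubling/tripling idea above is essentially choosing the smallest integer multiple of the frame rate that lands inside the panel's variable refresh window (what later low-framerate-compensation schemes automate). A minimal sketch of that selection logic using this monitor's 48-75Hz window; the function is purely illustrative and not anything AMD's driver actually exposes:

```cpp
#include <cstdio>

// Smallest integer multiple of the frame rate that falls inside the VRR window,
// or 0 if no multiple fits (the awkward zone described in the comment above).
static int refreshMultiplier(double fps, double vrrMin, double vrrMax) {
    for (int m = 1; m * fps <= vrrMax; ++m) {
        if (m * fps >= vrrMin) return m;
    }
    return 0;
}

int main() {
    const double kMin = 48.0, kMax = 75.0;  // LG 34UM67's advertised FreeSync range
    const double tests[] = {20.0, 30.0, 42.0, 60.0};
    for (double fps : tests) {
        int m = refreshMultiplier(fps, kMin, kMax);
        if (m)
            std::printf("%2.0f fps -> show each frame %dx, refresh at %.0f Hz\n", fps, m, m * fps);
        else
            std::printf("%2.0f fps -> no multiple fits the %.0f-%.0f Hz window\n", fps, kMin, kMax);
    }
    return 0;
}
```

Running it shows 20 fps maps to 60Hz (tripled), 30 fps to 60Hz (doubled), 60 fps passes through, and 42 fps has no workable multiple, which is exactly the roughly 37-48 fps gap identified above.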
Soulwager - Tuesday, March 31, 2015 - link
But AMD isn't doing that, and the VRR window is too small to do window shifting. If you want to display every frame on time, your max frame interval needs to be greater than your frametime variance plus double the minimum frame interval.
Soulwager - Tuesday, March 31, 2015 - link
You can test input lag with inexpensive hardware, for example an Arduino with native USB that emulates a mouse input and measures a subsequent brightness change with a photoresistor.
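For readers who want to picture what that tester looks like, here is a rough, untested sketch along the lines Soulwager describes, assuming a native-USB Arduino (Leonardo/Micro class) with the stock Mouse library, a photoresistor divider on pin A0 aimed at the screen, and a test program on the PC that flashes from dark to light on a click. The pin and threshold values are placeholders, not a verified design:

```cpp
// Rough click-to-photon latency tester (illustrative sketch, not a tested design).
// Needs a native-USB Arduino (e.g. Leonardo/Micro) so it can enumerate as a HID mouse.
#include <Mouse.h>

const int kSensorPin = A0;    // photoresistor voltage divider watching the screen
const int kThreshold = 600;   // ADC reading that counts as "screen went bright" (tune per setup)

void setup() {
  Serial.begin(115200);
  Mouse.begin();
}

void loop() {
  unsigned long t0 = micros();
  Mouse.click(MOUSE_LEFT);                      // inject the input event
  while (analogRead(kSensorPin) < kThreshold) {
    // spin until the photoresistor sees the brightness change
    // (a real tester would add a timeout here)
  }
  unsigned long t1 = micros();
  Serial.println(t1 - t0);                      // one end-to-end sample, in microseconds
  delay(500);                                   // give the test program time to go dark again
}
```

Note that this measures the whole chain (USB polling, OS, test program, GPU, display), which is precisely the objection raised a few comments further down; the counter-argument there is that with enough samples the variable parts average out.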
Ryan Smith - Tuesday, March 31, 2015 - link
If you could, please shoot me an email.
Soulwager - Tuesday, March 31, 2015 - link
Done.
OrphanageExplosion - Wednesday, April 1, 2015 - link
Is there a link with an explanation for this somewhere so we can all take a look at this idea?
Soulwager - Thursday, April 2, 2015 - link
Yes, here's a forum post: http://forums.blurbusters.com/viewtopic.php?f=10&a...
jjj - Tuesday, March 31, 2015 - link
This gave me an idea: a Cypress PSoC board instead of an Arduino could also work, and maybe you could make a similar device to test touch responsiveness in phones and tablets. Cypress makes touch controllers, so maybe they would help you out with some coding to enable you to test touch responsiveness. You could at least try. I guess Arduino started with Atmel chips, and Atmel is also one of the major touch controller players, so you could try to ask for their help too.
cbrownx88 - Tuesday, March 31, 2015 - link
Yes - please email him! lol
willis936 - Wednesday, April 1, 2015 - link
It's worth mentioning that this wouldn't be good test methodology. You'd be at the mercy of how Windows is feeling that day. To test monitor input lag you need to know how long it takes between when a pixel is sent across DisplayPort (or whatever) and when it is updated on the display. It can be done without "fancy hardware" with a CRT and a high speed camera. Outside of that you'll need to be handling gigabit signals.
willis936 - Wednesday, April 1, 2015 - link
Actually it can still be done with inexpensive hardware. I don't have a lot of experience with how low level you can get on the display drivers. You would need to find one that has the video transmission specs you want, and you could dig into the driver to give out debug times when a frame started being sent (I could be making this unnecessarily complicated in my head; there may be easier ways to do it). Then you could do a black and white test pattern with a photodiode to get the response time + input lag, then some other test patterns to try to work out each of the two components (you'd need to know something about pixel decay and things I'm not an expert on).
All of the embedded systems I know of are VGA or HDMI though...
Murloc - Wednesday, April 1, 2015 - link
I saw some time ago that some company sold an affordable FPGA development board with video output. Maybe that would work.
Soulwager - Wednesday, April 1, 2015 - link
You can still calibrate with a CRT, but you can get thousands of times more samples than with a high speed camera (with the same amount of effort). USB polling variance is very easy to account for with this much data, so you can pretty easily get ~100 microsecond resolution.
willis936 - Wednesday, April 1, 2015 - link
100 microsecond resolution is definitely good enough for monitor input lag testing. I won't believe you can get that by putting mouse input into a black box until I see it. It's not just Windows; there's a whole lot of things between the mouse and the screen. AnandTech did a decent article on it a few years back:
http://www.anandtech.com/show/2803/7
Soulwager - Thursday, April 2, 2015 - link
Games are complicated, but you can make a test program as simple as you want; all you really need to do is go from dark to light when you give an input. And the microcontroller is measuring the timestamps at both ends of the chain, so if there's an inconsistency you haven't accounted for, you'll notice it.
AnnonymousCoward - Friday, April 3, 2015 - link
If Windows adds unpredictable delays, all you need to do is take enough samples and trials and compare averages. That's a cool thing about probability.
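That intuition is just the standard error of the mean: random delays added by the OS or USB polling shrink roughly as 1/sqrt(N) when you average N samples, while any constant bias stays put. A small illustrative helper (names are mine, not from any test suite):

```cpp
#include <cmath>
#include <vector>

// Mean and standard error of the mean for a set of latency samples (microseconds).
// With enough samples the random jitter averages out; only consistent bias remains.
struct Summary { double mean; double sem; };

Summary summarize(const std::vector<double>& samples) {
    const double n = static_cast<double>(samples.size());  // assume n >= 2
    double sum = 0.0;
    for (double s : samples) sum += s;
    const double mean = sum / n;
    double var = 0.0;
    for (double s : samples) var += (s - mean) * (s - mean);
    var /= (n - 1.0);                        // unbiased sample variance
    return { mean, std::sqrt(var / n) };     // standard error ~ sigma / sqrt(n)
}
```

Comparing the averages of two runs (say, two monitors driven by the same rig) then cancels most of the shared pipeline delay, which is the "compare averages" part of the argument.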
Ryan Smith - Wednesday, April 1, 2015 - link
CRTs aren't a real option here unfortunately. You can't mirror a 4K LCD to a CRT, and any additional processing will throw off the calculations.
invinciblegod - Tuesday, March 31, 2015 - link
Having proprietary standards in PC gaming accessories is extremely frustrating. I switch between AMD and nVidia every other generation or so and I would hate for my monitor to be "downgraded" because I bought the wrong graphics card. I guess the only solution here is to pray for nVidia to support Adaptive-Sync so that we can all focus on one standard.
invinciblegod - Tuesday, March 31, 2015 - link
I assume you didn't encounter the supposed horrible backlight bleed that people seem to complain about on forums. That (and the currently proprietary nature of FreeSync until Intel or Nvidia supports it) is preventing me from buying this monitor.
Crunchy005 - Wednesday, April 1, 2015 - link
More unsupported rather than proprietary in nature. "Proprietary" seems to get thrown around too much.
p1esk - Tuesday, March 31, 2015 - link
Double the resolution, then get the price down to $499, and I will consider it.
wolrah - Tuesday, March 31, 2015 - link
"the screen went black and never came back"
*snicker*
Asmodian - Tuesday, March 31, 2015 - link
Isn't FreeSync disabled when over the VRR range? Your in-game (F1) ghosting test is with FreeSync disabled. What is the ghosting like with FreeSync active? I understand overdrive is forced off whenever FreeSync is active.
JarredWalton - Tuesday, March 31, 2015 - link
Behavior over the max VRR depends on the game setting -- it can be either VSYNC on (which will effectively cap FPS at max VRR) or VSYNC off. I personally leave it off, as I like being able to run at higher FPS.
Dribble - Tuesday, March 31, 2015 - link
And his other point about ghosting - on other FreeSync monitors overdrive gets disabled when FreeSync is on; is that the case here?
JarredWalton - Wednesday, April 1, 2015 - link
I took plenty of ghosting pictures to check this out as much as possible. From what I can see, overdrive ("Response Time" in the LG OSD) is fully active with or without FreeSync.
Soulwager - Tuesday, March 31, 2015 - link
"AMD tells us that they drive a display at its max refresh rate when the frame rate drops below the cutoff"
Could you test that? PCPer said the display was staying at its MINIMUM refresh rate when your framerate drops below the cutoff.
JarredWalton - Tuesday, March 31, 2015 - link
Sadly I don't have any equipment suitable for testing the actual refresh rate, which is why I say "claims". Right now, I'm pretty sure that if you fall below 48 FPS the display refreshes at 48Hz. I suspect it's something AMD can change with driver updates. We're still waiting on CrossFire FreeSync as well, so maybe the next driver update will alter the way this works.
dragonsqrrl - Tuesday, March 31, 2015 - link
This is not a good variable refresh rate display, plain and simple. In fact I would argue it's not a good desktop monitor in general. I think the fact that this monitor even came to market illustrates the difference between Nvidia and AMD's strategy and design philosophy.
hammer256 - Wednesday, April 1, 2015 - link
Hm, just curious, which design philosophies are you speaking of?
Spoelie - Wednesday, April 1, 2015 - link
NVIDIA ~ Apple-like: control the entire customer experience, at a price.
See this: http://www.forbes.com/sites/jasonevangelho/2015/03...
AMD ~ open standards: it's the monitor maker's responsibility not to deliver a shitty display.
Problem is that, for now, the entire G-SYNC experience is superior and no easy fix for that. See:
http://www.pcper.com/reviews/Graphics-Cards/Dissec...
http://www.tftcentral.co.uk/reviews/acer_xb270hu.h...
Crunchy005 - Wednesday, April 1, 2015 - link
Windows opens standards up to the computer maker to deliver good hardware, and it's up to all the hardware manufacturers to deliver good drivers. Apple computers do have a very nice benefit there, although I feel they're still more open than Nvidia at times. Then again, proprietary and closed isn't necessarily helping Apple computers in the gaming world, so controlling everything isn't always good.
Antronman - Tuesday, March 31, 2015 - link
>Freesync
>Ultrawide
>14ms response time
>$649 MSRP
Who's going to buy this?
medi03 - Wednesday, April 1, 2015 - link
People who were after dual monitor setup?
jabber - Wednesday, April 1, 2015 - link
Quite handy for video editing enthusiasts too.
Antronman - Wednesday, April 1, 2015 - link
Cheaper and higher resolution with two 1080 monitors.
Same cost for two 1440p monitors.
sibuna - Wednesday, April 1, 2015 - link
Lots of people. I have the 34UM95 (no interest in any of the "sync" techs); the monitor replaces 2 27" 1440p monitors.
I'm never going back.
bizude - Sunday, April 5, 2015 - link
14ms *full* response time. 5ms GTG, which is standard. Not the fastest, for sure, but plenty fast enough for gaming. The only people who won't be satisfied by this response time are CS:GO addicts.
mobutu - Wednesday, April 1, 2015 - link
You should really test this one, 144Hz IPS:
There's one really good 144Hz IPS panel/monitor: http://www.tftcentral.co.uk/reviews/acer_xb270hu.h...
wigry - Wednesday, April 1, 2015 - link
In the article, it was said that the 34" ultra wide screen is enormous - bigger than most people are used to. Well it might be at first glance, but you get used to it within a day or two. However, let's see what a 34" monitor really is.
The 34" monitor with 21:9 aspect ratio is nothing more than standard 27" display, 16:9 aspect and 2560x1440 resolution with extra 440 pixels added to both sides to extend the width to get 21:9 aspect ratio.
Therefore if you are used to for example 30" 16:9 display then going to 34" 21:9 is going backwards to smaller display.
I personally upgraded from 23" 16:9 display to 34" 21:9 curved ultra wide (Dell U3415W) and am very satisfied.
Anyhow the 34" display is not enormous, just a bit bigger - if you are used to 27" then it is athe same but a bit wider, if you are used to 30" then it is already smaller display.
dragonsqrrl - Wednesday, April 1, 2015 - link
"The 34" monitor with 21:9 aspect ratio is nothing more than standard 27" display, 16:9 aspect and 2560x1440 resolution with extra 440 pixels added to both sides to extend the width to get 21:9 aspect ratio."Except this monitor doesn't have a 3440x1440 panel, it's 2560x1080...
"Therefore if you are used to for example 30" 16:9 display then going to 34" 21:9 is going backwards to smaller display."
What do you mean by smaller? He stated in the conclusion that it's significantly wider than his old 30" monitor.
JarredWalton - Wednesday, April 1, 2015 - link
Exactly. The LG 34UM67 measures 32.5" wide; my old 30" WQXGA measures 27.5" wide. On most desks, a five inch difference is quite noticeable, and going from a 27" display that was 25.5" wide makes it even more noticeable. Is it bad to be this big, though? Well, not if you have the desk space. I still want the 3440x1440 resolution of the 34UM95, though.
wigry - Thursday, April 2, 2015 - link
Well, all I can say is that for many, the vertical space is more important than horizontal space. Many refuse to go from 1200 to 1080 vertical pixels regardless of the width. So if converting to an ultra wide screen, watch out for the vertical dimensions, both physical and in pixel count, and make sure that you are willing to make the necessary compromises.
Also, regarding the reference to a 27" monitor, I again took my own Dell as an example (for some reason I assumed that all LG panels are also 1440 px high). However, as the height is 1080, it is comparable to a 1920x1080 display that is stretched to 2560 pixels (320 pixels added to both sides).
Ubercake - Wednesday, April 1, 2015 - link
I love LG IPS panels. My television is an LG IPS type. They are among the best I've seen. Color accuracy is not something I consider important while gaming so I don't get hung up on this aspect. Also, viewing angles are important if you don't sit directly in front of the monitor when you game, but who in the heck doesn't sit dead center while they game? Viewing angle is another of the less important aspects of a gaming monitor.
This monitor offers far better contrast than any G-sync monitor so far and contrast is really important when your enemy is camped out in the shadows in a multi-player FPS and should absolutely be considered when looking for a gaming monitor. I also like the resolution/aspect ratio of this monitor for gaming.
Three things that would keep me from buying this monitor:
1) Can't go below 48 fps or above 75 fps without introducing tearing. Games like Crysis 3, or those using TressFX like Tomb Raider, most definitely bring framerates below the 48Hz/48 fps horizon with details and AA cranked on a 290X or GTX 980. Check multiple benchmarks around the web and you'll see what I mean. Why bother? You have a range of 27 fps (48Hz-75Hz) in which your games have to run in order to get any FreeSync advantage.
2) AMD stated there wouldn't be a price premium, yet there is. All the hype prior had every AMD rep saying there is no added cost to implement this technology, yet there really is because there is a change to the production process. Apparently, many manufacturers have not bought into the adaptive sync "standard" yet.
3) The color gamut on the ROG Swift is slightly better than this IPS monitor's. I stated color accuracy is not that important to me, but if I'm buying an IPS monitor, it had better provide better color accuracy than a TN.
Also, input lag is a measurable aspect. Not sure why this was essentially left out of the review.
Crunchy005 - Wednesday, April 1, 2015 - link
Yeah, I agree that the 27Hz range is dumb; it needs a wider range, and FreeSync can support a much wider range. FreeSync actually has a far wider range than G-Sync so when a monitor comes out that can take advantage of it it will probably be awesome. I'm sure the added cost premium is the manufacturer trying to make a few bucks off of something "new" - not really AMD's fault, and nothing they or anyone can do about it except LG. It also might cost more because you get a different scaler that might be higher quality than another, who knows.
Ubercake - Wednesday, April 1, 2015 - link
Potential is nothing unless realized.
This is a poor implementation with that limited frequency range.
I find the best part about the dynamic refresh monitors is that, for instance, in the case of a GTX 980 and a ROG Swift monitor, you can use one flagship video card with the G-sync monitor and that's all you need for great gaming performance.
No more multi-card setup is needed to crank the frame rates out of this world on a high-refresh monitor in order to minimize tearing.
As long as you have a card that keeps frame rates near 30 and above at a given resolution and detail level, you get great performance with the ROG Swift.
With this monitor, you're going to have to keep the frame rates consistently above 48 fps to get equivalent performance. This may seem easy with most titles and a 290 or 290X, but like I said earlier, try something like Crysis 3 or Tomb Raider and you'll find yourself below 48 fps pretty often.
JarredWalton - Wednesday, April 1, 2015 - link
Crysis 3 and Tomb Raider run above 48 FPS quite easily on a 290X... just not with every possible setting maxed out (e.g. not using SSAA). But at Ultimate settings, 2560x1080, Tomb Raider ran 72.5 FPS average with a minimum of 58 FPS.
Crysis 3 meanwhile is quite a bit more demanding; at max (Very High, 4xMSAA) settings it ran 33.2 FPS average, with a minimum of 20.1 FPS. Dropping AA to FXAA only helps a bit: 40.9 FPS average and 25.4 minimums. Drop Machine Spec from Very High to High however and it bumps to 60.6 FPS average and 45 FPS minimum. If you want to drop both Machine Spec and Graphics to High, 48+ FPS isn't a problem and the loss in visual fidelity is quite small.
gatygun - Tuesday, June 30, 2015 - link
And at the end of the day there is no 21:9 G-Sync monitor, so that 980 will be useless. Also, the Swift costs about 400 more than the 29UM67 (which is the model that should be looked at, not the 34", as its PPI is absolute shit) and the 980 costs 200 more than a 290X. That means you're pretty much paying double the price in total.
Is it worth it? Sure, but the price is just way too expensive, and you won't have a 21:9 screen but a 1440p screen, which will result in needing more performance than a 1080p ultra wide screen.
Getting 48 fps should be looked at as getting a stable 60 fps. Even a 670 can run ~40 fps in Crysis 3 at 3440x1440 ultra wide resolution; a single 290 won't have issues maintaining 48 fps if you drop the settings a few notches. Most ultra settings are just there to kill performance anyway, for people to keep on buying high performance cards.
The 29UM67 in my view is a solid product and a good cheap alternative 21:9 which performs solidly. The 14 ms they talk about isn't the grey to grey figure; that's 5ms, which for IPS is pretty much the best you can get. This screen is about as fast in input lag as any 5ms TN gaming monitor.
It also helps that it features 75hz.
gatygun - Tuesday, June 30, 2015 - link
In addition, it's pretty much the best gaming 21:9 monitor on the market, for a cheap price on top of it.
dragonsqrrl - Wednesday, April 1, 2015 - link
"FreeSync actually has a far wider range than G-Sync so when a monitor comes out that can take advantage of it it will probably be awesome."
That's completely false. Neither G-Sync nor the Adaptive-Sync spec have inherent limitations to frequency range. Frequency ranges are imposed due to panel specific limitations, which vary from one to another.
bizude - Thursday, April 2, 2015 - link
Price premium?! It's 50$ cheaper than its predecessor, the 34UM65, for crying out loud, and has a higher refresh rate as well.
AnnonymousCoward - Friday, April 3, 2015 - link
The $ goes on the left of the number.
gatygun - Tuesday, June 30, 2015 - link
1) The 27Hz range isn't an issue; you just have to make sure your game runs at 48+ fps at all times, which means you need to drop settings until you hit 60+ fps on average in less action packed games and 75 on average in fast paced action games, which have a wider gap between average and low fps.
The 75Hz upper limit isn't an issue, as you can simply use MSI Afterburner to lock it to 75 fps.
The 48Hz lower limit should actually have been 35 or 30; it would make things easier for the 290/290X for sure and you could push better visuals. But the screen is a 75Hz screen and that's what you should be aiming for.
This screen will work perfectly in games like Diablo 3 / Path of Exile / MMOs, which are simple games GPU-wise and will push 75 fps without an issue.
For newer games like The Witcher 3, yes, you need to trade off a lot of settings to get that 48 fps minimum, but at the same time you can just enable v-sync and deal with the additional controlled lag from those few drops you get in stressful situations. You can see them as your GPU not being up to par. CrossFire support will happen at some point.
2) Extra features will cost extra money, as they will have to write additional documentation, additional software functions, etc. It's never free; it's just that AMD GPUs handle the hardware side of things instead of the monitor makers having to buy licenses and hardware and plant them into the screens. So technically, especially in comparison to Nvidia, it can be seen as free.
The 29UM67 is at the moment the cheapest FreeSync monitor on top of it; it's the little brother of this screen, but for the price and what it brings it's extremely sharply priced for sure.
I'm also wondering why nobody made any review of that screen though; the 34 inch isn't great PPI-wise while the 29 inch is perfect for that resolution. But oh well.
3) In my opinion the 34" isn't worth it; the 29UM67 is what people should be looking at. With a price tag of 330 at the moment, it's basically 2x if not 3x cheaper than the Swift. There is no competition.
I agree that input lag testing is really needed for gaming monitors and it's a shame much attention wasn't paid to it here.
All in all, the 29UM67 is a solid screen for what you get. The 48 minimum is indeed not practical, but if you like your games hitting high framerates before anything else this will surely work.
twtech - Wednesday, April 1, 2015 - link
It seems like the critical difference between FreeSync and G-Sync is that FreeSync will likely be available on a wide range of monitors at varying price points, whereas G-Sync is limited to very high-end monitors with high max refresh rates, and they even limit the monitors to a single input only for the sake of minimizing pixel lag.
I like AMD's approach here, because most people realistically aren't going to want to spend what it costs for a G-Sync-capable monitor, and even if the FreeSync experience isn't perfect with the relatively narrow refresh rate range that most ordinary monitors will support, it's better than nothing.
If somebody who currently has an nVidia card buys a monitor like this one just because they want a 34" ultrawide, maybe they will be tempted to go AMD for their next graphics upgrade, because it supports adaptive refresh rate with the display that they already have.
I think ultimately that's why nVidia will have to give in and support FreeSync. If they don't, they risk effectively losing adaptive sync as a feature to AMD for all but the extreme high end users.
Ubercake - Thursday, April 2, 2015 - link
Right now you can get a G-sync monitor anywhere between $400 and $800.
AMD originally claimed adding freesync tech to a monitor wouldn't add to the cost, but somehow it seems to.
Ubercake - Thursday, April 2, 2015 - link
Additionally, it's obvious by the frequency range limitation of this monitor that the initial implementation of the freesync monitors is not quite up to par. If this technology is so capable, why limit it out of the gate?
Black Obsidian - Thursday, April 2, 2015 - link
LG appears to have taken the existing 34UM65, updated the scaler (maybe a new module, maybe just a firmware update), figured out what refresh rates the existing panel would tolerate, and kicked the 34UM67 out the door at the same initial MSRP as its predecessor.
And that's not necessarily a BAD approach, per se, just one that doesn't fit everybody's needs. If they'd done the same thing with the 34UM95 as the basis (3440x1440), I'd have cheerfully bought one.
bizude - Thursday, April 2, 2015 - link
Actually the MSRP is $50 cheaper than the UM65.
gatygun - Tuesday, June 30, 2015 - link
Good luck getting 48 minimums on a 3440x1440 resolution on a single 290x as crossfire isn't working with freesync.
FlushedBubblyJock - Thursday, April 2, 2015 - link
So freesync is low fps sunk.
Another amd failure, no surprise there.
Plus high fps sunk.
Just a tiny near worthless middle ground - again...
Now you know why nVidia actually made technology happen with new hardware, while amd claimed it is everyone else's job to do it, to make it work for amd.
Freesync is only free for amd, no one else.
FlushedBubblyJock - Thursday, April 2, 2015 - link
Don't forget it doesn't work with most games because the aspect ratio is wrong.
ROFL
Epic failure # xxx for amd.
I can hear it now- "It's not amd's fault"... blah blah blah blah - yeah they sure "worked with" the "industry" didn't they. There's probably a boatload of unresponded to emails and phone messages sitting in the mad to do box - well too late again amd.
It's someone else's job, right...
Black Obsidian - Thursday, April 2, 2015 - link
So... because this one monitor isn't to your liking, FreeSync is an "epic failure" for AMD? That's some stellar logic right there.
Those not hopelessly poisoned by fanboi-ism will note that there's no lack of released and pending FreeSync monitors running a wide range of sizes, aspect ratios, refresh rate ranges, and panel technologies.
wigry - Thursday, April 2, 2015 - link
What's wrong with the 21:9 aspect ratio? All but the oldest games have no problem making full use of this aspect ratio. I have managed to enable it on many of my old favorites, and new games have no trouble with it whatsoever. So 21:9 aspect is nothing to be afraid of.
bizude - Thursday, April 2, 2015 - link
You're an idiot. 99% of games out there work flawlessly with 21:9 monitors.
AnnonymousCoward - Saturday, April 4, 2015 - link
"flawlessly"? No, 99% of the games listed here are not Hor+ Native. http://www.wsgf.org/mgl?page=1
bizude - Sunday, April 5, 2015 - link
And that list is flat out wrong. For example, it lists AC 1 & 2 as not supporting ultra-widescreen, but both of them work with ultrawidescreen resolutions "out of the box". If you'd like, I can take screenshots for you.
AnnonymousCoward - Saturday, April 4, 2015 - link
Jarred,
-Watch out for overdrive. I think on the older Samsungs with it, it made input lag horrendous. Overdriving might always require at least 1 extra frame of buffering/processing.
-For gaming, I think the higher the color gamut the better. Allowing a full range of saturated colors is more realistic for your eyesight. I see it as ok for 3D rendering, which is different from viewing pictures saved on a reduced gamut space.
-It's inexcusable to have no height adjustment on a huge monitor like this one.
-For general desktop use, the AOC Q2770PQU 27" 2560x1440 seems better. Vertical space is too important.
Ethos Evoss - Saturday, April 4, 2015 - link
AMAZING !
mlmcasual - Monday, April 6, 2015 - link
1080P=FAIL...
FXi - Monday, May 25, 2015 - link
It would have been helpful to list in the article what the "consistent" contrast ratio was. I'd be guessing it was more like 750/800:1 given IPS performance in the past, but while the low and high are very useful, knowing where someone will likely land - somewhere in the middle of the road - would be useful to readers. Only making a suggestion. I am always grateful for the things you DO include in your reviews and I read them pretty much through and through.
KarenS - Friday, July 24, 2015 - link
There are no VESA mounting holes on this monitor. Could you verify that you reviewed the correct one? I bought a "34UM67" and did not find any VESA mounting holes. The pictures on your site show no mounting holes either.
Jiffybag - Sunday, October 11, 2015 - link
How did he flatten the game curve when the monitor has no gamma controls? I bought this monitor and my gamma looks exactly like his pre-calibration gamma image (starting high, ending low), but as there is NO way to calibrate gamma (only colour / white balance) I was unable to correct it. Anyone care to explain? Jarred?
Jiffybag - Sunday, October 11, 2015 - link
Game = gamma (auto correct got me) :)
My settings:
Backlight set to 20 (120cd/m2)
Black adjuster set to 0
Using i1Display Pro
Power savings etc all turned off
Colour calibration is good (all under delta 1.6)
Colour temp is spot on 6500k
Grey scale delta error all less than 1
Gamma set to "1" in menu
Gamma average is 2.2 BUT it's a diagonal line \ starting high (at 2.4) and ending at 1.9.
As there is no 10point gamma control I am unable to figure out how to flatten the gamma as there is no gamma controls? My HDTV has 10pt gamma control so I can raise 10/20/30 and lower 70/80/90 to flatten a curve, but this monitor has absolutely NO (ZERO) gamma controls so how on earth can he flatten it to such a decent flat line? I'm baffled? Unless he used the dynamic contrast adjuster (black level adjuster) and/or used his GFX card to make adjustments to his output, I'm unsure how he was able to do this. I'd love to know though if anyone can enlighten me :)
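Not speaking for Jarred, but the usual way a curve like that gets flattened on a monitor with no OSD gamma controls is on the graphics card side: calibration software typically measures a ramp of grey patches and writes a per-channel correction curve into the GPU's gamma ramp (the vcgt data in the ICC profile it generates), so the monitor itself never needs a 10-point control. That is an assumption about this review, not something it states. Conceptually the correction is just out = in^(target / measured) for each grey level; a toy sketch, where the interpolated "measured" gamma from 2.4 (dark) to 1.9 (bright) only mimics the slope described above and is not real measurement data:

```cpp
#include <cmath>
#include <cstdio>

// Toy gamma-correction table: if the panel shows gamma g_meas at a given grey level and the
// target is 2.2, loading out = in^(target / g_meas) into the video card's gamma ramp flattens
// the response. Real calibration packages measure many patches and iterate; the interpolation
// below is only a stand-in for measured data.
int main() {
    const double target = 2.2;
    const int steps = 256;
    for (int i = 1; i < steps; ++i) {              // entry 0 stays 0
        double in = static_cast<double>(i) / (steps - 1);
        double g_meas = 2.4 + (1.9 - 2.4) * in;    // stand-in for the measured gamma curve
        double out = std::pow(in, target / g_meas);
        std::printf("%3d -> %.4f\n", i, out);      // normalized correction entry
    }
    return 0;
}
```

So if the i1Display Pro is only being used to measure, the panel's native curve is what you will keep seeing; building a profile and loading it as the system default is typically what applies the correction.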
Jiffybag - Sunday, October 11, 2015 - link
I have a 34UM67 and it has VESA mounting holes, but no gamma correction control? :-/
rya - Monday, October 19, 2015 - link
Has anyone tried overclocking this monitor or altering the FreeSync range? I'd love to run FreeSync from 9Hz - 80Hz (or higher) if possible.