Dell Leaks Details of a 24” UHD 4K (3840x2160) Monitor (anandtech.com)
Watch out for problems with the monitor. New ultra-high-def monitors often have problems such as lag or poor color accuracy.
Before buying this, wait until a professional with a colorimeter and a lag meter evaluates it, if you remotely care about color accuracy or gaming. Resolution isn't the only consideration when buying a new monitor.
> if you remotely care about color accuracy or gaming
Lots of hackers only care about having enough pixels to legibly represent characters in their tiny coding font.
Finally, a resolution that will let me fit enough text on the screen when combined with http://mrl.nyu.edu/~perlin/homepage2006/tinyfont/ !
Would appreciate if someone can post a screenshot, or an equivalent that doesn't require enabling Java.
is that font downloadable somewhere?
A possible alternative is Minuscule: http://www.myfonts.com/fonts/256tm/minuscule/
> Resolution isn't the only consideration when buying a new monitor.
Sure it is! Reading on a monitor is pure torture ever since I bought a Retina MBP. If this is anywhere close to being affordable (realistically: no) I'd buy it for that reason alone.
>> Resolution isn't the only consideration when buying a new monitor.
> Sure it is!
So you'd be fine with a 3840x2160 monitor that has a 5Hz refresh rate? Or one with a 500ms lag time? Or one that only displayed in grayscale, or in 8-bit color?
Resolution is perhaps the most important consideration, but by no means the only one.
Yes. Actually, I'd be excited about a UHD e-ink display (which is basically what you're describing.) It'd be perfect as a second monitor to throw documents up on to read at my desk, while I kept the primary for more interactive stuff.
So it needs to be UHD _and_ e-ink. There goes the "UHD is everything I need" argument.
This would be a valid concern if it were still 1999. FYI, the IBM T220 (3840×2400!) already ran at 41Hz back in 2001. http://en.wikipedia.org/wiki/IBM_T220/T221_LCD_monitors
I have a T221 (not too expensive if you buy it on Yahoo Japan auctions via a suitable proxy). I was only able to get it up to 25Hz due to having only two DVI ports on my laptop docking station. I tried everything known to mankind to get four DVI or two dual-link DVI and convert those to the special ports of the T221. No dice.
As it runs at 1920x2400@25Hz you can't really watch a movie on it: in the middle of the screen, where the two halves meet, any movie shows a weird effect due to, I guess, the low frame rate and not-completely-synced outputs. I guess you could confine it to one half of the monitor, but gosh, are you going to watch your movies in 11"?
If the monitor allows 3840x2400@30Hz on DisplayPort 1.1 (which most people have -- 1.2 is rare; Intel got it in Haswell, and I think nVidia got to 1.2 maybe late 2011), that may be enough for coding and watching movies. Forget games. Time will tell. I will try when the monitor has a three-figure price tag.
Also, 4K displays achieving 60Hz over DisplayPort 1.2 are (at least currently) using the same tiled dual-display approach, with each tile being 1920x2160@60Hz and producing weird effects, as the article shows: http://www.hardwareluxx.com/index.php/reviews/hardware/vgaca...
Also, it will be fun and games: unlike with DVI, where figuring out whether you had single-link or dual-link was relatively easy, you will now need to figure out whether you have DisplayPort 1.1 or 1.2, with zero difference on the connector.
I can safely say I won't tolerate less than 20Hz, 50ms. Anything that clears that bar, even if the color is 8-bit and a little wonky, the viewing angle poor, and it's a bit dim, I'm taking it. I guess the other considerations are price and compatibility.
I think you are underestimating the weight of the word "torture" here. I mean sure it can be better but torture? Really?
I'm sure you're overestimating the use of casual language here.
I love my new Retina MBP, however the most annoying thing I found was dealing with so many non-retina things that look hideously blurry lol. I don't have a problem with the screen though, but weirdly enough in the last year or so I've had real problems with iPads. Like if I read or use it for even a short while, I get that nauseous headache feeling. Apparently it might have to do with LED screens.
Dealing with other lower-DPI screens is what I thought my problem would be, but the biggest issue I've found with moving to a Retina MacBook Pro are the drawbacks of the technology being so new.
Compared with my old CCFL-backlit desktop LCD, the Retina Display's backlight is far less uniform, and whites have a more readily perceptible color shift across the screen. It's not enough to qualify for a replacement (trust me, I tried), but it's nonetheless distracting enough to make me want to do most of my design work on my standard-DPI monitor.
The rMBPs I've run across (including my own) have had very uniform brightness and white color. I'm curious which manufacturer made the LCD in yours, given the usual problems with Apple LCDs.
If you aren't aware, you can tease out the LCD manufacturer via the model string and this command:
    ioreg -lw0 | grep "EDID" | sed "/[^<]*</s///" | xxd -p -r

If it begins with LSN, it's a Samsung (which tend to be the good ones); if it's LP, it's an LG (which, unsurprisingly, tend to be the poor ones).

It's an LG. No perceptible image retention, actually (the problem I was worried about), but I think I'd almost prefer it to these issues.
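For the curious, the one-liner just dumps the raw EDID block. A related check (not the exact one the commenter describes, which looks at the panel model string) is decoding the standard 3-letter manufacturer PNP ID packed into EDID bytes 8 and 9. A sketch, with a made-up `fake_edid` value for illustration:

```python
def decode_pnp_id(edid: bytes) -> str:
    """Decode the manufacturer PNP ID from EDID bytes 8-9.

    The EDID spec packs three 5-bit letters (1 = 'A') big-endian
    into a 16-bit word, with the top bit reserved.
    """
    word = (edid[8] << 8) | edid[9]
    letters = [(word >> shift) & 0x1F for shift in (10, 5, 0)]
    return "".join(chr(ord("A") + n - 1) for n in letters)

# 0x30E4 is LG Display's well-known ID; 0x4C2D is Samsung's.
fake_edid = bytes(8) + bytes([0x30, 0xE4])
print(decode_pnp_id(fake_edid))  # LGD
```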
How do you even notice these things?
> and whites have a more readily perceptible color shift across the screen
Is yours dysfunctional? Retina screen has honestly been a godsend.
It's a blessing and a curse. I'm sure at least 50% of people wouldn't even notice, and 90% wouldn't find it distracting. Alas, I do.
You could also have an issue with your neck or upper back. Try reading a somewhat heavy book in much the same position as you read an iPad.
To be honest I rather prefer a non-retina screen. I've got a chance to play with an rMBP and my wife has a Retina iPad mini.
The line details on UIs are too thin so contrast disappears and the text verticals are too thin so it hinders readability for me. iOS 7 admittedly doesn't help that but it's not for me.
Oh and according to my optician I have better than average sight.
How long would you stand a retina display that has only 10% of the contrast of your current monitor?
True, but resolution so massively outweighs every other consideration (for me) that it might as well be the only one.
In this new high-res world, virtually all desktop monitors are just broken. Ugh, I am typing these very words right now on a 30" Dell 3008WFP, and I want to gouge out my eyes...
Until the monitor supports at least 60Hz, I would say it's unusable for gaming.
With the 39" Seiki 4K "tv" at less than $500 shipped, the $3,500 pricing level on the Dells seems excessive. I'm certain the Dells are better monitors, but are they seven times better? If Seiki revs their model line-up to include DisplayPort, all these official monitors will really be in trouble.
I was a little floored by the Seiki price drop. When it happened, I tweeted at Seiki to ask if this is a precursor to a replacement unit that functions more as a monitor (60Hz, HDMI 2, no splash screen, DPMS on, no speakers, matte surface). That's wishful thinking, of course. The tweet was mostly just to put the demand in front of them. :)
Incidentally, my review of my wife's Seiki: http://tiamat.tsotech.com/seiki-4k
4K at 24" is really interesting, however. Not necessarily because I want to use a small monitor again, but rather because it's a sign that we are slowly inching toward high-DPI large form-factor displays. And I've been waiting for that to happen for a decade.
Next would be extending that pixel density to 30" (~6K) and larger (~8K). Then I will celebrate a bit.
Off-topic but thanks for the noscript message.
There's some price discrimination going on here, yes, but keep in mind that a) the PQ on that Seiki will be horrible; 2) at 39" it's way too large; and iii) as you note, it does not have a useful way to get a signal into it.
The win with a ~200dpi 24" 4k will be OS X-style Retina upscaling, not simply screen real estate.
(A) Horrible is a vast overstatement. Unless you need calibrated color, it is fine.
(2) Being too large is relative, my 30" 2560x1600 quickly became normal sized. For the additional pixels the additional inches of a 39" seems about right.
(iii) HDMI at 30Hz is just fine for anything but gaming. I used to have an IBM T221 that also ran at 30Hz and it was no problem at all for text and video. Some people expect there will be mouse lag, there wasn't.
I have an ASUS PQ321Q. Due to lack of support in OS X Mavericks for DP1.2 MST, OS X can only do 30Hz, but Windows on the same hardware can do 60Hz.
The difference is staggering. 30Hz is terrible, and the lag when moving the mouse, dragging a window, or scrolling text is perceptible and annoying.
Your monitor's electronics may be doing more processing at 30Hz (like frame-doubling) than at 60Hz thus adding latency that is not inherent in the lower refresh rate.
I used the T221 at 30Hz in a multi-monitor system with the other two monitors at 60Hz. I could drag a window so it straddled the 30Hz monitor and a 60Hz monitor. The DPI was severely mismatched so it wasn't very useful to straddle like that, but movement across the monitors was not obnoxious, not terribly fluid, but not annoying either.
I'm sure different people have different tolerances for latency, but I tend to think my tolerance is pretty low.
a) Not really. Obviously, Seiki isn't setting a benchmark for build quality. But it's also not "horrible." My wife's Seiki has zero dead pixels and is crisp and clear. Brightness isn't even around the edges, but "horrible" is going too far. It's far less horrible than a TN panel, for instance.
2) 39" is not way too large. I want 50" on my desktop.
iii) Yeah, HDMI 1.4 is limiting, but 30Hz makes it still usable for non-gaming. DisplayPort? I guess I won't balk at it, but just give me HDMI 2.
I am critical of the monitor for other reasons: a) it doesn't power up with DPMS on and has a splash screen, 2) it has a semi-glossy screen and professional monitors should be matte, and iii) it's HDMI 1.4.
30Hz is not usable to me; nor is 39", and I'm used to Apple's build and picture quality, so I guess I should've prefaced my post with "IMHO."
Seiki is still HDMI 1.x and doesn't support 60FPS. Nor will it help gaming at its limited refresh rate. The good news is that they're putting pressure on the big brands. If it were HDMI 2.0, I'd be all over it.
A 4K 39" TV has less resolution than a 4K 24" monitor. Just do the math!
This isn't about real estate (it is easy to buy a big monitor) but about pixel density.
Your point about the importance of pixel density is a good one, but the pedantry is misplaced. Resolution in the context of display resolution has referred to pixel count, not pixel density, for decades.
Yes, that was a mistake on my part. I meant to just talk about pixel density and somehow wrote in resolution.
Well, a 4K 39" TV still has about a 20% higher pixel density than the 1920x1200 24 inch monitor I'm using right now.
Additionally, you can take that extra size and just sit a little bit further away from it.
That isn't going to work in my cubicle, or any kind of coding situation, really.
I used to use 2x30" very comfortably for coding, I don't think this is that tough.
I tried two 24-inch monitors once, and my neck got sore for a week. My cubicle is simply too small for it to work well.
Ah, yeah, I WFH in a home office with a standing desk. Right now I'm using 3x1080p + my rMBP panel and I find it pretty solid.
I have a Seiki 39in that I use as a second monitor for my rMBP. The PPI (110) is the same as the Apple 27in Thunderbolt, and they have roughly the same picture quality once you calibrate the Seiki. 39 inches of uninterrupted space is awesome; the only downside I've noticed is some mouse lag under OS X, which can be fixed with some Quartz settings that then cause screen tearing. I've read it's not present under Windows, so maybe it's something that can be worked out with a software fix.
Exactly: the most interesting thing here isn't the 4k, but rather the 183.5ppi.
I really wish companies would stop selling monitors with their logo placed obnoxiously on the front. I don't care if they put the logo on any other part of the display, but why do they have to ruin the front of it? I really love looking at plain black slabs, with nothing to distract me from the content within.
I want to buy a couple of monitors and mount them. But there's just one small thing stopping me. I don't know if it's just me, but something feels wrong about rotating a monitor with such a prominent logo.
I think this is a problem too. I can understand the manufacturers wanting their logo on the bottom though.
If I was a screen manufacturer I would have a speaker bar on the bottom that has the logo and a few easy brightness controls on it - total Fisher Price usability.
I would then add in a feature for the pros - have it so the speaker/controls bar can be folded up under the screen.
In that way people that like their bling logos could have the logo on view, those that just want a panel can have no distractions.
If engineered nicely you could have USB and video inputs on the drop-down bar made accessible from the front.
Just paint the logo black. Or as close as you can get to the rest of the front's color. I'm doing this on all my Dells (using a paint marker), including a dot over the light of the power button.
When selling in retail how will people know whose monitor they are looking at? Logos serve many purposes, but in the wild they are a good source of marketing. Should we ask them to not brand even the box the CPU comes in?
At most they could make the logo easily removable; to me that would be a fair compromise.
I have recently gotten a taste of ultra high resolution monitors.
My coworker got the new Dell XPS 15, which has a QHD+ 3200x1800 screen. Just a heads up to coders, unless you plan to hunch your back or get new glasses, very few of you will enjoy the screen as much as you think you would.
Why's that? I picked up a QHD+ Samsung a few weeks ago and have been loving it for development work. All of the development tools I've needed so far have respected Windows' DPI scaling (which came set at 200% on the Samsung, making it easy to spot when a program failed to scale correctly).
The only major culprit so far has been Dropbox, which is infuriatingly frustrating to use at HiDPI. So bad it makes me want to move everything to SkyDrive or Google Drive.
How much do you actually interact with dropbox's UI? I would think that most of the time you would just be using it through the standard file/directory interaction things (file browsers, shell, etc).
Configuring selective sync (which was a must since this was on a laptop with a relatively small SSD) is particularly excruciating, and I do need to tweak that from time to time when large folders are added. The window that pops up when you click the system tray icon also has the scaling issue, which is something I do need to interact with on a daily basis. Luckily, it's not quite as bad as the selective sync configuration, but it's still pretty rough.
I've been living on a Retina MBP, which has a comparable 4x resolution, and I've come to exactly the opposite conclusion: if you work with text, you need a super-dense screen ASAP. I don't know how Windows handles it, but in OS X, "Retina" support means that e.g. my Emacs windows look amazing -- even as Java garbage like IntelliJ looks like some sort of Motif abomination.
Um, what? IntelliJ has supported HiDPI since about a month after the Retina MBPs shipped. It looks great.
The intent with these kinds of high DPI screens is usually to use scaling and keep interface elements roughly the same physical size (like Apple does with the RMBP), not to make everything tiny and still pixelated. If you spend all day looking at text it looks fantastic on a high density display.
Support on Windows isn't perfect, but most programs handle it well by now.
Set DPI scaling to get it to 2560x1440 "equivalent"? Not a general solution since so many apps misbehave, but if you're coding and spending most of the time in the editor, you get the extra real estate, plus the extra crispness in font rendering.
With DPI scaling set to 2x in Windows 8.1, text editors look no different than on a 1600x900 screen.
I code all day in Sublime on a 2560x1440 13" laptop at 1.5x scaling (Asus UX301LA-DH71T).
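For anyone puzzling over the "equivalent" resolutions being thrown around: under DPI scaling the OS just divides the physical pixel grid by the scale factor. A quick sketch (the function name is mine, not any OS API), matching the setups mentioned above:

```python
def logical_resolution(w_px: int, h_px: int, scale: float) -> tuple:
    """Logical desktop size: physical pixels divided by the DPI scale factor."""
    return (int(w_px / scale), int(h_px / scale))

print(logical_resolution(3200, 1800, 2.0))  # (1600, 900)  QHD+ at 200%
print(logical_resolution(2560, 1440, 1.5))  # (1706, 960)  13" laptop at 1.5x
```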
3840 x 2160 in 24 inches = 183.58 pixels per inch, compared to 204 for the IBM T220/T221
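Those figures are easy to check: PPI is the diagonal pixel count divided by the diagonal size in inches (22.2" is the T220/T221's published diagonal):

```python
import math

def ppi(w_px: int, h_px: int, diagonal_in: float) -> float:
    """Pixels per inch: length of the pixel diagonal over the physical diagonal."""
    return math.hypot(w_px, h_px) / diagonal_in

print(round(ppi(3840, 2160, 24.0), 2))  # 183.58 -- the leaked Dell
print(round(ppi(3840, 2400, 22.2), 2))  # ~204   -- IBM T220/T221
```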
We're almost back to 10 years ago, yay.
And in 10 years we may even return to CRT refresh rates.
I'm left wondering how long until we have mainstream video cards capable of driving games at these resolutions? Seems like we're a ways off...
4K is totally doable with a single GPU if you drop to using medium/high instead of very high/ultra quality settings for even the most demanding of games: http://hardocp.com/article/2013/11/11/geforce_gtx_780_ti_vs_...
For the majority of games 4K won't pose an issue whatsoever with a modern high end video card.
Oh yes, we are way, way off at this stage. Check this recent benchmark from Phoronix (on Linux, at least, Windows gaming may be a little better in performance...): http://www.phoronix.com/scan.php?page=article&item=linux_uhd...
With a Titan card from nVidia maxing out at 20 FPS on Unigine, it's pretty depressing. We need to see a 2 to 3x performance increase in graphics cards for gaming to be realistic on 4K screens.
Bear in mind that the Titan is not the fastest GPU; the 290X and the 780 Ti are. And they're decent at 4K: http://www.hardocp.com/article/2013/11/11/geforce_gtx_780_ti...
Depends what you call decent. 15 FPS(minimum) at medium settings in Crysis3 is hardly what I'd call decent for the best graphics chip out there. I'd say it's playable at best, but far from being where you want to be.
No, they both had 24 FPS minimum at medium settings.
Minimum really isn't all that interesting, though. A single low spike won't ruin a gameplay session. The average FPS was 42, the low spikes were few and far between. It's absolutely playable - that's the entire basis for which the settings were picked btw.
Average FPS of 42 is OK, but in any serious FPS you'd want to have 60 FPS constantly at least. Any number lower than that and your shooting accuracy decreases greatly with the framerate.
So, playable, probably, but enjoyable, not so sure about that.
Unigine is not a game, and it does not in any way reflect real-world game performance. The Heaven benchmark has proven time and time again to be entirely irrelevant, due in no small part to its insane usage of tessellation.
That's what GPU vendors have been waiting for; with the current Full HD max resolutions there wasn't really a need to upgrade in the last few years.
most impressive stat to me: 1.07 billion colors
my how far we've come from the good ol' days of CRT
Err ... color resolution was never really the problem with CRT technology.
Yeah. I hope that I'll live to see non-CRT consumer displays with actual 5-10 ms response times and real blacks, but I'm not too optimistic.
An OLED screen will deliver that (pure black and sub-ms response time). And they are on sale now as televisions (albeit at scary prices along the lines of US$13000 for a 55" screen). 4k OLED screens with a 55" diagonal have also been demonstrated. From the sounds of things, new manufacturing techniques are being developed that will bring the prices down to sensible levels.
Here's a review of one of the screens currently on the market: http://www.digitalversus.com/tv-television/lg-55ea980w-p1619...
Strangely, the first few OLED TVs at size have been concave (presumably more "because they can be" rather than because they need to be). Flat screens are also hitting the market now: http://www.oled-info.com/oled_devices/tv
We'll see how long it takes for good OLED tech to make it into the consumer computer monitor market. When I was younger, it was predicted that we would have affordable 600dpi, sub-ms switching time LED computer monitors by the early to mid 2000s.
"The market" is often an obstructionist bitch.
There's a new colour space https://en.wikipedia.org/wiki/Rec._2020 coming for the UHDTV (4K TV) standard; gonna be a son of a gun.
And to think I thought the Thunderbolt display was pricey at $1k. I love it, but I don't think I could possibly realize (or notice) the increase in capability unless working heavily in graphics and video, and that's assuming that the machine attached to it has a card that takes advantage of it.
The Monoprice 27" (using the same panel as the Apple Thunderbolt display) is under $400 this weekend: http://www.monoprice.com/Product?c_id=109&cp_id=10909&cs_id=...
I have one beside the same size thunderbolt display and I like the monoprice one better (it seems to have a more effective anti-reflective coating).
Not to mention the Thunderbolt Display is outdated with its MagSafe 1 and USB 2.0 ports, still-present FireWire, and it's thicker than the new iMacs.
Thunderbolt displays come with a MagSafe 1 -> MagSafe 2 converter in the box now.
I'm aware of that. But it's not quite the same as an updated Thunderbolt Display that will come with a native MagSafe 2 cable.
I wonder if I can get this in Australia. I know about the Catleap monitors, but I'd like to think that Monoprice does some QA.
I'd be very happy with an 8-bit colour 2560x1600 24" monitor. It would be more affordable and way easier for graphics cards to run, why is no-one making one of these!?
The whole idea behind 4K is that you can feed it 1080p content and it will be displayed no worse than it would be on a native 1080p display. In other words, it allows easily swapping between conventional 24" 1080p behavior and full 4K resolution, and this swapping can be done relatively easily on a per-application basis, or possibly even more granularly. Imagine, e.g., having a WebGL context pixel-doubled while the text on the same page is rendered at full resolution.
In comparison, in a 2560x1600 24" monitor you'd get either quite big/ugly double-pixels, or scaling artifacts of non-integer multiple scaling.
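The pixel-doubling case is the nicest one: a 2x integer scale maps every source pixel onto a clean 2x2 block, so no resampling happens at all. A toy sketch, with plain nested lists standing in for an image:

```python
def pixel_double(img):
    """Nearest-neighbour 2x upscale: each source pixel becomes a 2x2 block,
    so 1080p content lands exactly on a 4K grid with no interpolation."""
    out = []
    for row in img:
        doubled = [p for p in row for _ in range(2)]  # duplicate horizontally
        out.append(doubled)
        out.append(list(doubled))  # duplicate the row vertically
    return out

src = [[1, 2],
       [3, 4]]
print(pixel_double(src))
# [[1, 1, 2, 2], [1, 1, 2, 2], [3, 3, 4, 4], [3, 3, 4, 4]]
```

A non-integer factor (e.g. 1080p onto 2560x1600) has no such clean mapping, which is where the scaling artifacts the comment mentions come from.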
Dude. Sweet! If the price is reasonable, I'll buy 2-4 for my desktop. I'd love the cheap Korean knock-off version, dead pixels and all, even more, actually.
At $3,500 for the 32", the 24" will probably be $2,500-3,000. So, whether or not you will be getting 2-4 I guess depends on whether you'd rather have these monitors, or a car.
I picked up a couple of Korean 27" 1440p monitors for the desktop for around $300 and am loving them. They're even overclockable.
Yeah, Korean monitors are great if you're on a budget. My only complaint is that they are ugly as hell. Every single one I've seen. But hey, you can buy 3 1440p monitors for under a grand!
Can you provide more links to where I can buy them?
I've been looking at a higher resolution monitor than my 30" Dell and found that there are these things called "TVs" that apparently have 4K resolution but in 55" format.
I'm after screen real estate and want to see a lot of code at the same time. Does anyone have any thoughts about replacing my 30" work monitor with a 55" TV with a lot more resolution?
Although not 55", a post above refers to: http://tiamat.tsotech.com/seiki-4k
It seems like if you do not do gaming, you should be fine.