Why 32-bit color?
But in essence the problem is the hardware: we can't produce a higher bit depth monitor yet. Although this does make me think that a design for a 4- or 5-colour monitor with 10 bits on each channel might really shake up the market, as it would look fantastic compared to the current crop of monitors.
Kippa (Senior member):
Don't the mainstream and high-end 4K monitors support 10-bit depth, not counting the ultra low end?

Bubbleawsome (Diamond Member):
What is causing this limitation? Bandwidth, parts, or has no one really tried?

Mand (Senior member):
Mostly, it's that we can't control the gradations in the LCDs finely enough to get more bit depth. Most TN panels are 6-bit with dithering. Most IPS displays aimed at professional use are 8-bit, but some of them have 10-bit color levels. But you're really hitting the limit of LCDs at that point.
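The "6-bit with dithering" trick mentioned above can be sketched in a few lines. This is a minimal illustration assuming a simple 2x2 Bayer threshold matrix; real panels use more elaborate spatio-temporal patterns, and the function name is made up for this example.

```python
# Ordered (Bayer) dithering: quantize 8-bit input to a 6-bit panel while
# preserving the average brightness across neighbouring pixels.

BAYER_2X2 = [[0, 2],
             [3, 1]]  # thresholds 0..3, one 8-bit step apart

def dither_8bit_to_6bit(value, x, y):
    """Quantize an 8-bit value (0-255) to 6 bits (0-63) with ordered dithering."""
    threshold = BAYER_2X2[y % 2][x % 2]
    # One 6-bit step spans four 8-bit steps; the position-dependent
    # threshold decides whether this pixel rounds up or down.
    return min(63, (value + threshold) >> 2)

# A mid-grey of 130 (8-bit) falls between 6-bit levels 32 and 33.
# Neighbouring pixels alternate, so the spatial average approximates 32.5.
pixels = [dither_8bit_to_6bit(130, x, 0) for x in range(4)]
print(pixels)  # [32, 33, 32, 33]
```

The panel trades spatial resolution for apparent bit depth: the eye averages adjacent pixels into an intermediate shade.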
The next step up would be 12-bit color, which is 4096 levels. That's well beyond what we can control the LCDs to do accurately. Bandwidth over the display cables and through the GPU is another matter entirely, but that's trivial compared to actually displaying the levels required.

BrightCandle said:
I don't know of any 4k monitors that have been released that are even remotely 10 bit.
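The arithmetic behind the "levels" and bandwidth points above is easy to sketch. This toy calculation assumes a 3840x2160 monitor at 60 Hz and ignores blanking intervals and link encoding overhead, so real HDMI/DisplayPort figures run somewhat higher.

```python
# Levels per channel, and uncompressed pixel bandwidth at 4K60,
# for common panel bit depths.

def levels(bits):
    """Distinct values per color channel at a given bit depth."""
    return 2 ** bits

def bandwidth_gbps(width, height, hz, bits_per_channel, channels=3):
    """Raw pixel data rate in Gbit/s (no blanking, no encoding overhead)."""
    bits_per_frame = width * height * bits_per_channel * channels
    return bits_per_frame * hz / 1e9

for bpc in (6, 8, 10, 12):
    print(f"{bpc:2d}-bit: {levels(bpc):5d} levels/channel, "
          f"4K60 needs ~{bandwidth_gbps(3840, 2160, 60, bpc):.1f} Gbit/s")
```

Going from 8-bit to 10-bit only raises the raw data rate from roughly 12 to 15 Gbit/s, which supports the point that bandwidth is the easier half of the problem.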
Mark R (Diamond Member):
High-end monitors for professional use can support higher bit depths. The difference in performance between these and regular monitors is very difficult to spot, but there can be slight increases in banding on the regular ones. The issue becomes more apparent with "wide gamut" monitors. A "wide gamut" monitor can display a much richer selection of colors, but at 8 bit it still only has 256 levels per channel, spread over a wider range, so the steps between adjacent colors are larger. This can be very objectionable if you are doing software or GPU color correction to emulate a narrower gamut, such as sRGB.
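The sRGB-emulation problem can be shown with a toy calculation. The 75% gamut ratio below is made up purely for illustration, and the function name is hypothetical; the point is that squeezing sRGB into a fraction of a wide-gamut panel's 8-bit range collapses input levels together.

```python
# Why emulating sRGB on an 8-bit wide-gamut panel increases banding:
# if sRGB's red primary reaches only a fraction of the panel's native red,
# all 256 sRGB values must map into the bottom part of the panel's scale.

GAMUT_SCALE = 0.75  # hypothetical ratio of sRGB red to the panel's native red

def emulate_srgb(value_8bit):
    """Map an 8-bit sRGB value onto the wide-gamut panel's 8-bit scale."""
    return round(value_8bit * GAMUT_SCALE)

distinct_out = len({emulate_srgb(v) for v in range(256)})
print(f"{distinct_out} distinct panel levels for 256 sRGB inputs")  # 192
```

A 10-bit panel would have four times as many output steps in that compressed range, which is exactly where the extra bit depth pays off.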
The real limiting factor is lack of software support: most OSs don't really offer any useful "30 bit" support. Most 30-bit cards basically use driver tricks. The card runs in a 30-bit mode, but emulates a 24-bit mode for the OS, so when the OS sends data to the card, the hardware translates it to 30-bit mode, and vice versa. However, when applications use hardware-accelerated functions, the card actually renders in 30 bit direct to VRAM.
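For concreteness, "30-bit" framebuffers are typically carried in a 32-bit word using a 10:10:10:2 layout (as in formats such as DXGI's R10G10B10A2_UNORM, with red in the low bits). A minimal sketch of the packing:

```python
# Pack/unpack three 10-bit color channels plus a 2-bit alpha into 32 bits,
# the usual way "30-bit color" fits existing 32bpp pipelines.

def pack_rgb10(r, g, b, a=0):
    """Pack 10-bit R, G, B (0-1023) and 2-bit alpha (0-3) into one 32-bit word."""
    assert all(0 <= c <= 1023 for c in (r, g, b)) and 0 <= a <= 3
    return (a << 30) | (b << 20) | (g << 10) | r

def unpack_rgb10(word):
    """Recover (r, g, b, a) from a packed 10:10:10:2 word."""
    return (word & 0x3FF, (word >> 10) & 0x3FF, (word >> 20) & 0x3FF, word >> 30)

w = pack_rgb10(512, 256, 1023)
print(hex(w), unpack_rgb10(w))  # round-trips to (512, 256, 1023, 0)
```

Because the word size stays 32 bits, memory layout and bus transfers are unchanged versus 24-bit-plus-alpha; the extra precision comes out of the alpha channel's bits.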
Anyway, for editing the images it certainly makes sense to have more data, as otherwise your edit might make the limitation visible. If height mapping is part of your job, then the answer is no. By the time higher color depth becomes standard, height maps will probably be a thing of the past, though.

Info on 8-bit (24bpp) vs 10-bit (32bpp) monitors and GPUs, and professional graphics vs consumer:
Honestly this matters more for photographers and videographers doing professional graphics work than for others like designers or illustrators.
It helps with color grading and gradient banding. A good monitor calibrator and 24bpp can go a long way for graphic design. People who own a consumer-grade or better 10-bit monitor have actually had support from AMD Radeon GPUs for years, and more recently most GTX GPUs support it as well.
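The editing benefit mentioned above can be shown with a toy round-trip: darken a gradient by 4x and brighten it back. This is an illustrative sketch, not any application's actual pipeline; in 8-bit integer math the intermediate rounding destroys levels, while a higher-precision working space preserves them.

```python
# Editing in 8-bit integers vs floating point: darken by 4x, then restore.

def roundtrip_uint8(v):
    """8-bit edit: integer division throws away the low two bits for good."""
    darkened = v // 4
    return min(255, darkened * 4)

def roundtrip_float(v):
    """Higher-precision edit: the intermediate keeps fractional values."""
    return min(255, round((v / 4) * 4))

gradient = list(range(256))
levels_int = len({roundtrip_uint8(v) for v in gradient})
levels_float = len({roundtrip_float(v) for v in gradient})
print(levels_int, levels_float)  # 64 vs 256 surviving levels
```

The 8-bit path leaves only 64 of 256 levels, which shows up as visible banding; this is why editors keep a 16-bit or float working space even when the final output is 8-bit.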
Windows 10 and macOS have both just implemented native 10-bit (aka 32bpp) support, with Linux having had it for a few years already. All three OSs still have their own issues relating to 32bpp support. For games, video, and other applications, 32bpp support really depends on the developer and the color standards individuals publish content at; many of the latest games in theory support 32bpp color already. You should know that extending a monitor's color range to 32bpp does not mean that extended range will cover the desired color gamut.
I recommend reviewing the monitor you're looking at on tomshardware. It is generally recommended that you get a 27-inch or larger monitor if you're going to run a single-monitor setup, and as high a resolution as you can afford.
You'll want one with a 60 Hz or better refresh rate and a response time under 10 ms for daily workflow. Professional applications historically won't work with GTX GPUs for 32bpp color. Quadro GPUs will support 32bpp, mainly starting from later generations; please review the spec sheet for any given Quadro, as their general support for 32bpp color started later than with Radeons or FirePros.
Quadros are workstation-class GPUs but have less hardware power per dollar than the GTX lineup, so for graphic design areas other than 32bpp they may not be the best solution.
FirePro GPUs (or the Radeon WX series) have long had 32bpp support, even though computers for the most part until now haven't been able to benefit from it. FirePros are workstation-class GPUs but have less hardware power per dollar than the regular Radeon lineup, so for areas other than 32bpp they may not be the best solution for graphic designers.
Radeon GPUs use OpenCL or Vulkan as standard in their drivers, and some can be modified to use 32bpp with professional applications, but some cannot. In theory, 32bpp in general has been supported in the Radeon GPU lineup for years now.
Either Adobe or AMD currently does not allow Radeon drivers to work at 32bpp with the Adobe suite of applications, even though in theory it is natively supported by the GPU in the first place. However, if you can in fact find a Radeon that works for 32bpp in Adobe and has been recently released, it is probably the best value per dollar. Note that Quadro and FirePro are better for higher precision in 3D printing and other high-precision areas like medical imaging.
For CPUs, more cores are better for multitasking or anything to do with video. Non-video Adobe programs mainly use only a few cores, so faster frequency matters more there, but only to a small degree. For RAM, you'll want a minimum of 16GB for professional use; 32GB or more can help for video editing, but the difference isn't as big as the jump you'd see from anything under 16GB.
Just plan it based off your motherboard. For storage, you'll probably keep your finished images or videos on an HDD, since HDDs give you a larger area for working files, photos, or videos at a better price per GB. For cases and cooling, most professionals prefer to buy based on how quiet they are, as well as their cooling abilities, before anything else. Noctua makes by far the quietest CPU coolers and fans for the money; the NH-D14 is one of the best cheap CPU coolers on the market.
The Fractal Design R5 is claimed to be the quietest case as of November 1st.