I've been experimenting with color in OpenComputers, and I have a few questions relating to how the color values I code are changed into the color values I actually get.
I'm trying to create six color schemes (three colors each) to distinguish six modes of a GUI. After planning a few schemes and programming in the color hex codes, I've found (using gpu.get()) that the colors I enter are very rarely the ones displayed on screen. Almost all colors are changed to some degree, and quite often (45% of the time in my last test, see spoiler tags) two different colors display as the same color.
I realize this has everything to do with Color Depth, of course. So I'm interested in learning how OC color depth works, and how I can choose colors that will render accurately on-screen:
Is OC's "8-bit Color Depth" equivalent to the "3 bits Red, 3 bits Green, 2 bits Blue" 256-color scheme described in this Wikipedia article?
If so, is there a color table anywhere that lists the available colors? (I've been unable to find one).
Does the color table vary depending on the colors I've already used? In other words, is it possible that two colors may be valid when used in isolated programs by themselves, but trying to display them together merges them into one color?
Is there a way to configure Photoshop (or any other color app) to work with OC's color table? I'm familiar with Indexed Color mode, which gives you a 256-color palette, but none of the "fixed" options (Websafe, Windows System, Mac System, Spectrum, etc.) seem to fit the colors OC is giving me. I'd love some way to plan colors that will translate properly to OC.
Lastly, here's an example of a palette test program I ran that shows what I mean. It takes an array of different colors defined by hex code, and draws a square of each on the screen. Then, it checks each square with gpu.get() and identifies any repeating displayed colors with red frowny faces (as, ideally, all of the colors would be different):
E.g. 1: The first red frowny-face is flagging the second square for repeating the first square. Those squares were set to 0xFFF5EE and 0xFAF0E6, respectively, but both displayed as 0xF0F0F0.
E.g. 2: The second red frowny-face flags a repeat two squares earlier---the result of 0xFFE4C4 and 0xFFDAB9 both displaying as 0xFFDBBF.
(I realize these are very close colors, and it's entirely expected that they'd render the same at 8-bit color depth---the point of this test was more to figure out why both colors were altered, and what determines which colors change, and how.)
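For reference, the test program worked roughly like the sketch below. This is a reconstruction, not my exact code: the square size, positions, and the frowny-face marker are illustrative, but the logic (draw each color, read it back with gpu.get(), flag any displayed color already seen) is the same:

```lua
-- Rough sketch of the palette test (layout details are illustrative).
local component = require("component")
local gpu = component.gpu

-- Colors to test, as 24-bit hex values.
local colors = {0xFFF5EE, 0xFAF0E6, 0xFFE4C4, 0xFFDAB9}

local seen = {}  -- maps displayed color -> index of first square showing it
for i, c in ipairs(colors) do
  local x = (i - 1) * 4 + 1
  gpu.setBackground(c)
  gpu.fill(x, 1, 4, 2, " ")              -- draw a 4x2 square in this color
  local _, _, displayed = gpu.get(x, 1)  -- read back the color actually shown
  if seen[displayed] then
    -- this square rendered identically to an earlier one: flag it
    gpu.setBackground(0x000000)
    gpu.setForeground(0xFF0000)
    gpu.set(x, 3, ":(")
  else
    seen[displayed] = i
  end
end
```

(This needs an OpenComputers machine with a GPU and screen to run, of course; it's just to show what the test is measuring.)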
Thanks in advance for your help!