@Awoo @applejack literally the only thing keeping me from using linux wholesale at this point is the absolutely horrible colorblind support.
the programs that do provide filters are either laggy as hell, super unwieldy to use (like needing a second monitor just to see what the first one looks like with the filter), or they actually simulate colorblindness instead of correcting for it.
i don't necessarily have the money right now to commission a whole-ass program that would drop a filter over your screen, but it is something i'd eventually like to see happen.
the only package that seems to have been relatively good for colorblindness is gnome-mag, but it's loooong been abandoned and doesn't work anymore, plus i have no idea if it actually worked in the first place.
just using monitor or GPU color correction isn't a very elegant solution either, because it shifts all colors uniformly rather than taking specific color ranges and either amplifying or desaturating them. if i try to boost reds, for instance, grays end up becoming slightly red too, which windows' filter doesn't do (or if it does, the difference is way too minor to notice). maybe they're doing some black magic or something, but Android and iOS colorblind settings do the same thing too, just in reverse (they simulate it instead).
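A tiny sketch of the difference being described here: a global channel multiply (what monitor/GPU color correction effectively does) tints neutral pixels, while a hypothetical targeted filter that only touches red-dominant pixels leaves grays alone. The function names and thresholds are my own illustration, not anyone's actual filter:

```python
def global_boost(r, g, b, gain=1.2):
    # scale the whole red channel, the way monitor/GPU correction does
    return (min(int(r * gain), 255), g, b)

def targeted_boost(r, g, b, gain=1.2):
    # hypothetical targeted filter: only amplify pixels where red already dominates
    if r > g and r > b:
        r = min(int(r * gain), 255)
    return (r, g, b)

print(global_boost(128, 128, 128))    # gray picks up a red tint: (153, 128, 128)
print(targeted_boost(128, 128, 128))  # gray stays neutral: (128, 128, 128)
```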
@beardalaxy @Awoo Programs like redshift work on Xorg and can filter the colour, idk about Wayland
I also think programs like Picom allow OpenGL shaders and can be made to work per-window
I know how to program GLSL and can look into it, though idrk what a colour blindness shader does
@beardalaxy @Awoo Basically this, it's fairly simple. Just a formula run over each pixel
@beardalaxy @Awoo Looking up "colour blindness filter" I only find colour blindness simulating filters
How do I find them?
@applejack lol ain't that the problem xD
most colorblind filters simulate it even when they claim to help it. the majority of video games do this, for instance.
someone took screenshots of path of exile with all the windows 10 colorblind filters, but it doesn't exactly have the greatest array of colors to show off. i'd try to take some myself, but i honestly don't know how this person got these screenshots; taking them in windows removes the filter for me.
in order: normal, deuteranopia, protanopia, tritanopia.
@beardalaxy First is original. Her hair is greyish blue to green to dark blue from left to right
Eyes are light red
The filly's coat is banana yellow
The girl has normal skin colour
Second is a filter
-- boost reds and pull back greens; channel values are 0..1 floats
img:map(function(r, g, b)
    return math.min(r * 1.2, 1), math.max(g * 0.8, 0), b, 1
end)
@applejack i can definitely tell the red eyes are brightened up a bit, for me it goes from "probably green" to "probably red."
green in the hair is at a pretty good level.
the skin gets pretty pink though. i don't know how to explain it exactly, but the windows filter kind of brightens the skin while turning it to a redder hue, whereas yours darkens it with a redder hue, if that makes any sense.
then there is the problem of the background being blue instead of white lol. you're headed in the right direction though.
i'm working on getting an example video or something here.
@beardalaxy I was kinda expecting the background bit, and the skin is just an in-between of that. I'm thinking that if I alter the strength of the filter based on how much relative redness there is, it should work
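One way that redness-weighted idea could look, as a sketch with 0..1 channel floats; the weighting formula here (red's share of the pixel relative to the neutral 1/3) is my own assumption, not a worked-out filter:

```python
def redness_weight(r, g, b):
    # hypothetical weight: 0 for neutral/greenish pixels, up to 1 as red dominates
    share = r / (r + g + b + 1e-6)           # red's share; 1/3 means neutral
    return min(1.0, max(0.0, share - 1 / 3) * 3)

def correct(r, g, b, strength=1.0):
    # blend between the original pixel and the boosted one based on redness,
    # so grays and the white background stay untouched
    w = strength * redness_weight(r, g, b)
    nr = r + (min(r * 1.2, 1.0) - r) * w
    ng = g + (max(g * 0.8, 0.0) - g) * w
    return (nr, ng, b)
```

Neutral pixels get weight 0 and pass through unchanged, which should avoid the blue-background problem from the earlier attempt.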
@applejack here are some comparisons. one thing i don't like about windows' filter is that it blows out reds to the point where you lose gradients, but it's better than not having the filter at all. i wish i could change the strength of it.
it's worth noting that i have no idea if the protan/tritan settings are accurate, since i don't have those types of colorblindness. i'm assuming they are though, considering the deutan one is.
in order: normal, deuteranopia, protanopia, tritanopia.
@beardalaxy After playing around with some simple curves and ideas I got closer but not exact
Now I found this https://www.researchgate.net/publication/326626897_Smartphone_Based_Image_Color_Correction_for_Color_Blindness#pf3
And through that I found this, which lists the algorithms
http://www.daltonize.org/2010/05/lms-daltonization-algorithm.html
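For reference, a numpy sketch of the deuteranopia branch of that LMS daltonization algorithm: convert RGB to LMS, simulate the deficiency, and redistribute the lost information onto the other channels. The matrices are the ones I recall from the daltonize.org page, so treat the exact coefficients as something to double-check against the link:

```python
import numpy as np

# RGB -> LMS conversion (coefficients as listed on the daltonize.org page)
RGB2LMS = np.array([[17.8824,   43.5161,  4.11935],
                    [3.45565,   27.1554,  3.86714],
                    [0.0299566, 0.184309, 1.46709]])
LMS2RGB = np.linalg.inv(RGB2LMS)

# Deuteranopia simulation: the missing M-cone response is rebuilt from L and S
SIM_DEUTAN = np.array([[1.0,      0.0, 0.0],
                       [0.494207, 0.0, 1.24827],
                       [0.0,      0.0, 1.0]])

# Shift the lost red/green information onto the green and blue channels
ERR2MOD = np.array([[0.0, 0.0, 0.0],
                    [0.7, 1.0, 0.0],
                    [0.7, 0.0, 1.0]])

def daltonize_deutan(img):
    """img: float array of shape (H, W, 3), values 0..255."""
    lms = img @ RGB2LMS.T
    sim = lms @ SIM_DEUTAN.T           # what a deuteranope would see
    sim_rgb = sim @ LMS2RGB.T
    err = img - sim_rgb                # information lost to the deficiency
    return np.clip(img + err @ ERR2MOD.T, 0, 255)
```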
@beardalaxy Okay, I got the python to run and hooooly shit it's slow. My Lua script using my image library took 1.09s and the python one using numpy took 35.49s
Python output looks the same as with my Lua script. The page says explicitly that the algorithm is meant to correct for colour blindness, not just simulate it
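A guess at the slowdown (an assumption — I haven't seen that script): if it loops over pixels in Python, the interpreter overhead on every pixel dominates, even with numpy holding the data. The same work done as one whole-array operation is typically orders of magnitude faster:

```python
import numpy as np

def boost_loop(img):
    # per-pixel Python loop: pays interpreter overhead for every single pixel
    out = img.copy()
    h, w, _ = img.shape
    for y in range(h):
        for x in range(w):
            out[y, x, 0] = min(img[y, x, 0] * 1.2, 1.0)
    return out

def boost_vec(img):
    # same operation, vectorized over the whole array in one numpy call
    out = img.copy()
    out[..., 0] = np.minimum(img[..., 0] * 1.2, 1.0)
    return out
```

Both produce identical output; only the vectorized one stays in numpy's compiled code for the whole image.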