Singron
My personal blog.
05 August 2012
The XmlShader specification allows for multiple fragment shaders, with the output of one pass piped into the input of the next. Unfortunately, the existing XML parser has difficulty with dynamic XML structures (or I had no idea how to read a variable structure), so I wrote a new XML parser. It parses arbitrary XML structures and produces an easy-to-use tree.
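In case it helps picture what I mean by an easy-to-use tree, here is a rough sketch of the sort of node structure such a parser could produce. This is only an illustration; the XmlNode name, its members, and the firstChild() helper are made up for the example and are not the actual code.

```cpp
// Hypothetical sketch of the kind of tree such a parser might produce; the
// node layout and names here are illustrative, not the actual ScummVM code.
#include <map>
#include <string>
#include <vector>

struct XmlNode {
    std::string name;                          // element name, e.g. "fragment"
    std::map<std::string, std::string> attrs;  // attribute name -> value
    std::string text;                          // concatenated character data
    std::vector<XmlNode *> children;           // child elements in document order

    // Convenience lookup: return the first child with the given element name,
    // or 0 if there is none.
    XmlNode *firstChild(const std::string &childName) const {
        for (size_t i = 0; i < children.size(); ++i)
            if (children[i]->name == childName)
                return children[i];
        return 0;
    }
};
```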
Then I altered my existing OpenGL backend modifications to allow multi-pass rendering. This requires the framebuffer object extension, which is widely supported but not universally available, so I will have to add some compatibility options later.
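For anyone curious how multi-pass rendering with framebuffer objects works in outline, here is a hedged sketch: render each intermediate pass into a texture attached to an FBO, feed that texture to the next pass, and have the final pass draw to the screen. The Pass struct, drawQuad() helper, and renderPasses() function are assumptions made up for this example, not the actual backend code; only the EXT_framebuffer_object calls are real.

```cpp
// Illustrative sketch of multi-pass rendering with EXT_framebuffer_object.
// The Pass struct and drawQuad() helper are placeholders for the example.
#include <GL/glew.h>
#include <vector>

struct Pass {
    GLuint program;    // compiled and linked GLSL program for this pass
    GLuint outputTex;  // texture that receives this pass's output
};

void drawQuad();  // draws a screen-filling textured quad (omitted here)

void renderPasses(GLuint sourceTexture, const std::vector<Pass> &passes) {
    GLuint fbo;
    glGenFramebuffersEXT(1, &fbo);

    GLuint inputTex = sourceTexture;  // the game screen uploaded as a texture
    for (size_t i = 0; i < passes.size(); ++i) {
        const bool lastPass = (i + 1 == passes.size());

        if (!lastPass) {
            // Intermediate passes render into a texture attached to the FBO.
            glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, fbo);
            glFramebufferTexture2DEXT(GL_FRAMEBUFFER_EXT, GL_COLOR_ATTACHMENT0_EXT,
                                      GL_TEXTURE_2D, passes[i].outputTex, 0);
        } else {
            // The final pass draws to the default framebuffer (the window).
            glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, 0);
        }

        glUseProgram(passes[i].program);
        glBindTexture(GL_TEXTURE_2D, inputTex);
        drawQuad();

        // The output of this pass becomes the input of the next.
        if (!lastPass)
            inputTex = passes[i].outputTex;
    }

    glDeleteFramebuffersEXT(1, &fbo);
}
```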
In the meantime, almost all of the shaders in the bsnes repository work (https://gitorious.org/bsnes/xml-shaders). Some of them are quite interesting. They scale to arbitrary sizes, although some look better at certain resolutions (scale2x, 5xBr, etc.). Ctrl+Alt+9 and Ctrl+Alt+0 switch shaders.
Right now, I am in the middle of nowhere. I'll be back home next week, but until then I will have very limited connectivity. Feel free to send me an email or leave a comment, and I'll read and reply when I get a chance.
28 July 2012
Status
I've added some support for multiple shaders using XmlShaderFormat. It looks for all *.shader files and tries to parse and compile them. However, there isn't a good way to change shaders yet. So far, all changes have been in the base OpenGL backend, but in order to expose the shaders through the graphics mode settings, changes may also need to be made to the SDL backend. Right now, it queries a static class function to figure out the supported graphics modes, but OpenGL calls to compile the shaders cannot succeed before the OpenGL context is created. So either the class needs to be initialized before the SDL backend queries the graphics modes, or the static class function has to guess about which shaders will eventually compile. For instance, if "CoolFilter.shader" exists in the current directory but is not a valid XmlShader, or does not contain valid GLSL programs, it would still be reported as a valid graphics mode.
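To make the "guessing" option concrete, here is a sketch of what it could look like: enumerate *.shader files before any OpenGL context exists and report each one as a graphics mode, deferring validation until the shaders are actually compiled. The ShaderMode struct, listShaderFiles(), and guessShaderModes() names are placeholders for this example, not the actual classes.

```cpp
// Illustrative sketch of the "guess" approach: report every *.shader file as a
// graphics mode before a GL context exists; whether it actually parses and
// compiles is only discovered later. All names here are placeholders.
#include <string>
#include <vector>

struct ShaderMode {
    std::string name;  // e.g. "CoolFilter" from "CoolFilter.shader"
    std::string path;  // file to parse and compile once a context exists
};

std::vector<std::string> listShaderFiles(const std::string &dir);  // omitted

std::vector<ShaderMode> guessShaderModes(const std::string &dir) {
    std::vector<ShaderMode> modes;
    std::vector<std::string> files = listShaderFiles(dir);
    for (size_t i = 0; i < files.size(); ++i) {
        // No validation is possible here: the file may not be a valid
        // XmlShader, and its GLSL may still fail to compile later.
        ShaderMode mode;
        mode.path = files[i];
        mode.name = files[i].substr(0, files[i].rfind(".shader"));
        modes.push_back(mode);
    }
    return modes;
}
```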
My work on scaler plugins has slowed down. I want to get it incorporated into the main project, so the things I need to consider are:
- How will the plugins work with existing backends that used the old scalers?
- How will they work with future backends?
- Do more formats need to be supported? Can they be detected at runtime or compile time (or probably both)?
- How can they be integrated with the build system?
- Options to disable formats (e.g. 32-bit formats that are not used)
- Options to disable plugins. Currently, the Edge2x/3x plugin shares the HQ scaler compile option (USE_HQ_SCALERS); a sketch of what separate guards could look like follows this list.
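To make that last point concrete, here is a hypothetical sketch of giving the Edge scalers their own compile guard instead of piggybacking on the HQ one. USE_EDGE_SCALERS is an invented name for this example; it is not an existing build option.

```cpp
// Hypothetical compile guards: USE_EDGE_SCALERS does not exist today; the
// Edge2x/3x plugin currently builds under USE_HQ_SCALERS.

#ifdef USE_HQ_SCALERS
// HQ2x/HQ3x plugin registration would go here.
#endif

#ifdef USE_EDGE_SCALERS
// Edge2x/Edge3x plugin registration would go here, so it can be
// enabled or disabled independently of the HQ scalers.
#endif
```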
22 July 2012
Shaders
I have added some very basic support for shaders to the OpenGL backend. Right now, it looks for a "vertex.glsl" and a "fragment.glsl" file to load as the vertex and fragment shader respectively. Eventually I want to have some manifest file that stores the properties of shader programs, but I have not done that yet.
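For reference, loading the two files boils down to the usual GLSL compile-and-link sequence, sketched below. The readFile() helper and the reduced error handling are assumptions for this example, not the actual backend code.

```cpp
// Sketch of compiling "vertex.glsl" and "fragment.glsl" into a GLSL program.
// readFile() is an assumed helper; error reporting is reduced to returning 0.
#include <GL/glew.h>
#include <string>

std::string readFile(const char *path);  // reads the whole file (omitted)

static GLuint compileShader(GLenum type, const std::string &source) {
    GLuint shader = glCreateShader(type);
    const char *src = source.c_str();
    glShaderSource(shader, 1, &src, NULL);
    glCompileShader(shader);

    GLint ok = GL_FALSE;
    glGetShaderiv(shader, GL_COMPILE_STATUS, &ok);
    if (!ok) {
        glDeleteShader(shader);
        return 0;
    }
    return shader;
}

GLuint loadShaderProgram() {
    GLuint vs = compileShader(GL_VERTEX_SHADER, readFile("vertex.glsl"));
    GLuint fs = compileShader(GL_FRAGMENT_SHADER, readFile("fragment.glsl"));
    if (!vs || !fs)
        return 0;

    GLuint program = glCreateProgram();
    glAttachShader(program, vs);
    glAttachShader(program, fs);
    glLinkProgram(program);

    GLint linked = GL_FALSE;
    glGetProgramiv(program, GL_LINK_STATUS, &linked);
    return linked ? program : 0;
}
```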
Right now, two uniforms are passed to the shaders. The first is the texture to use, called "texture". The second is a vec2 called "textureDimensions" containing the width and height of the texture in pixels. This is useful since texture coordinates are in the range [0..1], so it is otherwise difficult to tell where one pixel ends and another starts. I have attached a shader program that implements scale2x (Advmame2x) by finding the position within a pixel using these uniforms. It duplicates the functionality of the scaler in the SDL backend, but it looks kind of funky with scale factors != 2. It could probably be modified to use a combination of Advmame3x and Advmame2x depending on subpixel position.
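On the C++ side, passing those two uniforms only takes a couple of calls once the program is bound. The uniform names "texture" and "textureDimensions" are the real ones described above; the function name and parameters in this sketch are placeholders.

```cpp
// Sketch of feeding the two uniforms described above to a bound program.
// program/texWidth/texHeight are placeholders for this example.
#include <GL/glew.h>

void setShaderUniforms(GLuint program, int texWidth, int texHeight) {
    glUseProgram(program);

    // Sampler uniform: texture unit 0, where the game screen texture is bound.
    glUniform1i(glGetUniformLocation(program, "texture"), 0);

    // Texture size in pixels, so the shader can recover pixel-space coordinates
    // (e.g. texCoord * textureDimensions) and tell where one source pixel ends
    // and the next begins.
    glUniform2f(glGetUniformLocation(program, "textureDimensions"),
                (GLfloat)texWidth, (GLfloat)texHeight);
}
```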
What is great about this is that no development tools are required to change the shader programs. So users can create and load shaders without having to configure build environments and compile ScummVM. Hopefully this can lower the barrier to making some creative works.
Advmame2x shader: https://gist.github.com/3161079
ScummVM branch with enabled shaders: https://github.com/singron/scummvm/tree/opengl
16 July 2012
I've Been Away...
But now I'm back. I had limited time to work last week, so I have no progress to show since then. I was working on a spline-interpolating filter that could be extended into an arbitrary-size scaler, but it was needlessly complicated for poor results, and I have scrapped it to work on new, more useful things.
Right now I am revising the last API addition I made (comparing the current frame to the previous frame to update only the necessary pixels). It forced extra code into the backend, and with so many backends, it would get easier adoption if more of the bookkeeping were migrated into the scaler code. To simplify the addition of new plugins wishing to use this feature, the relevant code now lives in a class that new plugins can inherit from to get all the bookkeeping for free.
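Roughly, this is what I mean by inheriting the bookkeeping: a common class keeps a copy of the previous source frame and answers "has this region changed?". The class and method names below are illustrative assumptions, not the actual scaler API.

```cpp
// Illustrative sketch of a base class that keeps a copy of the previous source
// frame so derived scaler plugins get changed-pixel detection for free.
// Names are assumptions, not the actual ScummVM API.
#include <cstring>
#include <vector>

typedef unsigned char uint8;

class SourceTrackingScaler {
public:
    // Returns true if this row of source pixels differs from the previous
    // frame. Derived scalers can skip rows (or finer regions) that match.
    bool rowChanged(const uint8 *src, int y, int rowBytes) {
        if ((int)_oldSrc.size() < (y + 1) * rowBytes)
            _oldSrc.resize((size_t)(y + 1) * rowBytes, 0);

        uint8 *old = &_oldSrc[(size_t)y * rowBytes];
        bool changed = std::memcmp(old, src, rowBytes) != 0;
        if (changed)
            std::memcpy(old, src, rowBytes);  // remember the row for next frame
        return changed;
    }

private:
    std::vector<uint8> _oldSrc;  // copy of the previously scaled source image
};
```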
Then I'll be taking a look at the OpenGL backend to implement the scalers as shaders. Hopefully, with some improvements, the OpenGL backend can deliver better performance and quality, giving people a reason to actually use it (it is not even included in many release packages).
06 July 2012
Edge2x/3x Finished Up
The performance of the scaler is fine now, even in debug builds without optimization. I have reimplemented the changed-pixel detection through a new part of the API: the backend queries the plugin to see whether it supports comparing against an old image to detect changed pixels. One problem is that panning the screen causes the whole image to be re-updated and the mouse movement to become choppy. However, it happens rarely enough that it really is not an issue, and in optimized builds it does not matter.
I also templated the function for 32bpp support. This scaler is unique in that it uses the products of interpolation to compare against other pixels (in other scalers, the products of interpolation are only written to the final image). The existing interpolation functions mangled the alpha channel (in the case of RGBA and ARGB) and the padding bits (in RGB888). This caused quirky image defects that were tricky to track down, since the differing alpha channels changed the comparisons without changing the visible color of the pixels.
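As a hedged illustration of the kind of fix involved: when averaging two 32-bit pixels whose result will later be compared, force the alpha/padding byte to a known value so it cannot skew equality checks. The ARGB layout and constant names below are simplifying assumptions, not the actual interpolation templates.

```cpp
// Sketch of averaging two packed 32-bit pixels while keeping the alpha/padding
// byte deterministic, so later pixel comparisons are not skewed by it.
// The ARGB8888 layout is a simplifying assumption for this example.
#include <cstdint>

static const uint32_t kChannelMask = 0xFEFEFEFEu;  // drops each channel's low bit
static const uint32_t kAlphaMask   = 0xFF000000u;  // alpha/padding in the top byte

inline uint32_t interpolateHalf(uint32_t a, uint32_t b) {
    // Per-channel average without unpacking: (a & b) keeps the shared bits,
    // ((a ^ b) & mask) >> 1 adds half of the differing bits.
    uint32_t avg = (a & b) + (((a ^ b) & kChannelMask) >> 1);

    // Force a fixed, fully opaque alpha so identical colors always compare
    // equal even if their source alpha/padding bytes differed.
    return (avg & ~kAlphaMask) | kAlphaMask;
}
```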
In the past, I had debugged these problems by simply returning the color red from an interpolation function and then checking whether the broken pixels turned red. Since this scaler compared the results of the interpolation, whenever I tried a similar technique the scaler would choose a different path, and the image would change in more chaotic ways (e.g. lines ceasing to anti-alias, or black pixels appearing instead of red ones). Everything at least appears to be fixed now.
Here are some sample images scaled with the 32bpp scaler.
Edge2x
Edge3x
02 July 2012
Clarification on Edge Scaler Optimization
I made some ambiguous statements about which optimizations I disabled in the Edge scaler. Currently, scalers update partial rectangles of the screen based on which pixels have actually changed. However, this still causes the same calculations to be repeated on many pixels that did not change. For fast scalers this is good enough, but the Edge2x/3x scalers needed more of a speedup.
So the original scaler author included code that took these partial rectangles and tried to reconstruct the full source image inside the scaler. It then diffed that reconstructed image against future calls to find out exactly which pixels needed to be updated. However, this involves a lot of guesswork about where the rectangles sit in the original image. I disabled this particular optimization since the backend can more simply give this information to the scaler through a new part of the API (currently in design). Dirty rectangle updates still work just like they do with the other scalers.
The new part of the API will probably be optional for both backends and scalers. A scaler will request that an old source image be kept by the backend and passed in, so that the scaler can run a diff and update the pixels however it wants. This does not complicate other scalers, does not change backends that would not use the Edge scaler, and provides some needed functionality.
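A rough sketch of the shape such an optional API could take follows. Every name in it (wantsOldSource, oldSrcPtr, and so on) is a placeholder for this example, not the API actually being designed.

```cpp
// Rough sketch of an optional old-source-image handshake between backend and
// scaler. All names here are placeholders for the example.
#include <cstdint>

class Scaler {
public:
    virtual ~Scaler() {}

    // A scaler that wants to diff against the previous frame returns true;
    // a backend that implements the optional API then keeps a copy of the
    // last source image and passes it back on the next call.
    virtual bool wantsOldSource() const { return false; }

    // oldSrcPtr is NULL when no previous frame is available (first frame, or
    // a backend that does not implement the optional API); the scaler can
    // then fall back to updating the whole dirty rectangle.
    virtual void scale(const uint8_t *srcPtr, uint32_t srcPitch,
                       const uint8_t *oldSrcPtr,
                       uint8_t *dstPtr, uint32_t dstPitch,
                       int width, int height) = 0;
};
```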
29 June 2012
Edge2x/3x scaler
So this week I added the Edge2x/3x scaler. It is the most complex scaler yet, producing the highest-quality images while also being the most CPU-hungry.
It had some pretty ingenious heuristics to reconstruct the entire source image, see exactly which pixels had changed, and update only the relevant regions. Of course, it was originally written years ago, so this kind of hack is no longer necessary, and I have disabled it. On my laptop, everything seems smooth as long as it is compiled with optimization; without it, it is quite jerky (the compiler must be doing something right). However, I am considering adding optional access to the old source image as part of the scaler API so that these hacks would not be needed.
So here are some comparisons. Click to see larger versions.
Normal2x
HQ2x
PM2x
Edge2x
Notice that the Edge2x scaler manages to anti-alias almost every edge, whereas the other scalers tend to miss some. In particular, look at the roundness of the coins and the features of Guybrush's body.
EDIT: I have clarified some of the optimizations I disabled in another post.