This is something that I've been meaning to talk about for quite some time now, but kept forgetting to blog about it and get everyone else's viewpoint on it...
Right now, the game that's been showing the most progress on Cxbx is Castlevania: Curse of Darkness. What does it do? Well, for starters, it DOES go ingame, but it has two key issues:
- Lack of 8-bit palettized texture support
- Incomplete and buggy vertex shader conversion
The first one has already been implemented rather well in Dxbx; I just haven't gotten around to adding it to Cxbx. This is something I can most definitely do: I have a basic .tga loader that I modified to support 8-bit textures, and I once created a very basic mini third-person shooter for the Xbox, so I have a means of testing it thoroughly. In short, I'm not too worried about this atm.
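For anyone unfamiliar with what palettized texture support actually involves, here's a minimal sketch of the core idea: an 8-bit palettized (P8) texture stores one palette index per texel, and each index is expanded through a 256-entry color table into a full 32-bit pixel. The function name, palette layout (packed 0xAARRGGBB), and signature below are my own illustrative assumptions, not actual Cxbx or Dxbx code.

```c
#include <stdint.h>
#include <stddef.h>

/* Expand an 8-bit palettized texture into 32-bit pixels.
 * indices: width*height palette indices, one byte per texel
 * palette: 256 packed 0xAARRGGBB entries
 * out:     width*height output pixels
 * (Illustrative sketch only; real emulator code also has to handle
 * things like swizzled texture layouts and pitch/stride.) */
static void expand_p8_texture(const uint8_t *indices,
                              const uint32_t palette[256],
                              uint32_t *out,
                              size_t width, size_t height)
{
    for (size_t i = 0; i < width * height; ++i)
        out[i] = palette[indices[i]];
}
```

The appeal of this format for the game is obvious from the loop: each texel costs one byte instead of four, which is exactly the kind of memory saving a PS2-era engine would have been built around.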
But what DOES worry me is the vertex shader problems. Honestly, I have absolutely NO IDEA how Cxbx converts the vertex shaders into a usable form, nor do I understand how vertex or pixel shaders are handled on a real Xbox. I know it should be straightforward, but I've always had trouble grasping it.
Fixing these two things will get the game playable beyond the shadow of a doubt. Input, background music, and everything else works fine. Keep in mind that this game was ported from the PS2 and still contains some code architecture best suited to the PS2. In place of the PS2's VU transformation code, all of the transformations are done in the vertex shaders. The textures are also all palettized because of the PS2's video memory limitations (only 4MB, but lots of bandwidth).
But the biggest question is: do people want this? Honestly, that's not a very big concern for me, because I care more about the emu's overall success and compatibility than about what people want to play right now.
So, please let me know what you all think of this. I'm really curious to know.