EQEmulator Forums

EQEmulator Forums (https://www.eqemulator.org/forums/index.php)
-   General::General Discussion (https://www.eqemulator.org/forums/forumdisplay.php?f=586)
-   -   Anyone interested in client development? (https://www.eqemulator.org/forums/showthread.php?t=33432)

PixelEngineer 08-26-2012 08:40 PM

Quote:

Originally Posted by KLS (Post 212066)
Lion and Mountain Lion both support up to OpenGL 3.2 with GLSL 1.5.

It's honestly pretty shitty support as 3.3 has been out for over two years and they don't support any of the 4.x profiles (derp) but not quite as bad as being stuck on 2.1.

http://i.imgur.com/Gx3Ot.png

http://i.imgur.com/gW54Y.png

KLS 08-27-2012 01:16 PM

https://developer.apple.com/graphics...1074_Core.html

Something is messed up with your setup.

PixelEngineer 08-27-2012 04:32 PM

Quote:

Originally Posted by KLS (Post 212087)
https://developer.apple.com/graphics...1074_Core.html

Something is messed up with your setup.

Good find. Not sure why that's happening.

KLS 08-27-2012 09:09 PM

I think I read somewhere that the earlier drivers with the 10.7 update didn't support the HD 3000 very well or something. Perhaps it has something to do with that? I don't know really, but in theory you should have GL 3.2 support (which is roughly DX10-level capability).

cavedude 08-29-2012 07:24 PM

Found you on Reddit ;) http://www.reddit.com/r/everquest/co...r_my_original/

PixelEngineer 08-31-2012 07:16 PM

Quote:

Originally Posted by cavedude (Post 212125)

Yes you did!

Transparency:

http://s13.postimage.org/5b4izmqgl/S...4_10_17_PM.png

Caryatis 08-31-2012 09:17 PM

That link is hilarious.

One guy has been working a year on some screenshots and thinks being able to fly around will be awesome (maybe eqemu should support that? oh wait).

Everybody else thinks that just because you can support new features, it won't take a skilled artist to produce the things in their heads. Nothing against you, Pixel; the engine is definitely an awesome project.

Tabasco 08-31-2012 10:15 PM

It would probably be more valuable to make a client that can authenticate and enter a world first. From there you can just use placeholders as you develop your modelling pipeline. At that point you have the real beginnings of a functional client instead of a model viewer.

The work on the various file formats is excellent, but considering that eqemu is probably the most successful open source MMO, a client that doesn't have a bunch of non-free attachments or lawsuit risk would be incredible.
Asset creation is daunting, but at the point where you have a cube running around inside a box and can see and interact with other cubes, community involvement would probably take over. You could even borrow quite a bit from a place like opengameart.org. Asset creation is hard, but it's a process, and if I can model and rig a character in Blender, anyone can.

PixelEngineer 08-31-2012 11:17 PM

Quote:

Originally Posted by Tabasco (Post 212179)
It would probably be more valuable to make a client that can authenticate and enter a world first. From there you can just use placeholders as you develop your modelling pipeline. At that point you have the real beginnings of a functional client instead of a model viewer.

The work on the various file formats is excellent, but considering that eqemu is probably the most successful open source MMO, a client that doesn't have a bunch of non-free attachment or lawsuit risk would be incredible.
Asset creation is daunting but at the point you have a cube running around inside a box and can see and interact with other cubes, community involvement would probably take over. You could even borrow quite a bit from a place like opengameart.org. Asset creation is hard, but it's a process, and if I can model and rig a character in blender, anyone can.

If I am not mistaken, this was the basis for Windcatcher's Simple Client. That project was fantastic but I would really love to support the actual zones of EverQuest and I don't think it would be legally problematic unless copyrighted materials were modified.

Take a look at all of these open source engine recreations: http://en.wikipedia.org/wiki/Game_engine_recreation

I doubt very many of them have run into legal trouble. That being said, I am really not that far away from the transition from zone viewer to client. I realize how much work will need to get done for it to be a full-blown client, but it is still a goal I will work towards. I can't wait for the day when I can stop working on the graphics side and focus on the actual client game programming.

My posting the link to reddit wasn't the greatest idea (especially on my actual account) but I was proud of what I had and wanted to share. I think people there are hopeful for what this project can really be, as am I. Regardless, I want my contribution to this community to be an open source client that people can use for whatever they want.

Cheers

Tabasco 09-01-2012 12:25 AM

You've got plenty to be proud of, the engine is good work. It's not precisely open source at this point, but that's your call.

We have no way of knowing whether or not SOE will pull a page from Blizzard's playbook, but that's not the point. Eqemu is a server framework that can be devoid of non-free content but it's bound to a non-free, evolving client.
Windcatcher's client was in Delphi and never worked without a custom login server as far as I know, so it's a pretty poor example.
I just hate to see a promising project consumed by the hassles of backward compatibility with a client that already does a pretty good job of representing the nostalgia of classic EverQuest.

rhyotte 09-01-2012 01:45 AM

Cannot wait for the open source linux client!

PixelEngineer 09-01-2012 04:44 AM

Quote:

Originally Posted by Tabasco (Post 212191)
You've got plenty to be proud of, the engine is good work. It's not precisely open source at this point, but that's your call.

We have no way of knowing whether or not SOE will pull a page from Blizzard's playbook, but that's not the point. Eqemu is a server framework that can be devoid of non-free content but it's bound to a non-free, evolving client.
Windcatcher's client was in delphi and never worked without a custom login server as far as I know, so it's a pretty poor example.
I just hate to see a promising project consumed by the hassles of backward compatibility with a client that already does a pretty good job of representing the nostalgia of classic EverQuest.

The client isn't open source because I am ironing out the ugly details in my code. Any programmer of an open source project knows that sending your code out for public scrutiny is often a daunting process.

I understand the concerns about backwards compatibility, but here are my thoughts:
- I want this project to be compatible with old hardware.
- I want to have this client run as fast as possible.
- I want this client to run on all platforms (including Android and iOS)

and most importantly,

- I want this project to teach me about graphics and game programming.

That's about it. I do appreciate the feedback and criticisms.

Cheers

PiB 09-11-2012 03:16 PM

First poster here, also working on re-creating an EQ client using OpenGL. Seems to be a popular pastime here :)

PixelEngineer, are you using backface culling? I have tried to turn it on but this removes a lot of faces that should be visible:

http://i.imgur.com/OXNLa.jpg

When backface culling is disabled:

http://i.imgur.com/FkXpd.jpg

Looks like the normals and winding order of many faces are wrong in the s3d files. Or maybe there's some flag to invert them that I overlooked? In Blender, the normals of the Pedestal are pointing down instead of up:

http://i.imgur.com/b42rw.jpg

I could just leave it disabled but this sometimes causes z-fighting (look at the water in the second image).

KLS 09-11-2012 03:34 PM

If you know the order of the vertices you can just calculate your own normals (hint: you do!), which is what we do in azone for .map files.
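For reference, the face normal falls out of the vertex order via a cross product. A minimal sketch (right-handed convention, where counter-clockwise winding yields the outward-facing normal):

```cpp
#include <array>
#include <cmath>

using Vec3 = std::array<float, 3>;

// Cross product of the two edge vectors gives a vector perpendicular to
// the triangle; which way it points depends on the winding of v0..v2.
Vec3 faceNormal(const Vec3& v0, const Vec3& v1, const Vec3& v2) {
    Vec3 e1 = {v1[0] - v0[0], v1[1] - v0[1], v1[2] - v0[2]};
    Vec3 e2 = {v2[0] - v0[0], v2[1] - v0[1], v2[2] - v0[2]};
    Vec3 n = {e1[1] * e2[2] - e1[2] * e2[1],
              e1[2] * e2[0] - e1[0] * e2[2],
              e1[0] * e2[1] - e1[1] * e2[0]};
    float len = std::sqrt(n[0] * n[0] + n[1] * n[1] + n[2] * n[2]);
    if (len > 0.0f) { n[0] /= len; n[1] /= len; n[2] /= len; }
    return n;
}
```

Flipping any two vertices flips the resulting normal, which is exactly why consistent winding matters.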

PiB 09-12-2012 04:17 AM

I know how to calculate face normals, but I think the order of the vertices in the file is wrong. Half of the faces use clockwise order (first picture) while the other half use anti-clockwise order (second picture).

The reason I mentioned the normals is that, if all the normals were pointing in the right direction I could detect which faces have the wrong order and change it. Unfortunately it seems faces specified clockwise have wrong normals and vice versa.

http://i.imgur.com/Joy3H.jpg

http://i.imgur.com/JtH69.jpg

PiB 09-12-2012 10:58 AM

My bad, I was looking at the faces from the wrong direction. Looks like they are all specified in clockwise order. Asking OpenGL to cull front faces did the trick. D'oh!

PixelEngineer 09-12-2012 11:02 PM

The zones were originally written for rendering with DirectX, which differs from OpenGL in the direction of the Z axis. Scale your model-view-projection matrix by -1 in the Z direction. Then make the front face wind clockwise and cull the back face. That should do it.
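That flip can be sketched as a plain matrix operation; this assumes column-major storage as OpenGL expects, and the winding/culling state is left in comments since the GL calls need a live context:

```cpp
#include <array>

using Mat4 = std::array<float, 16>; // column-major, as OpenGL expects

// Post-multiplying by scale(1, 1, -1) negates the matrix's third column,
// converting between DirectX's and OpenGL's opposite Z conventions.
// The flip reverses the triangles' apparent winding, so the renderer
// must also set:
//   glFrontFace(GL_CW);
//   glCullFace(GL_BACK);
// (or equivalently cull front faces) for culling to stay correct.
Mat4 flipZ(Mat4 m) {
    for (int row = 0; row < 4; ++row)
        m[8 + row] = -m[8 + row]; // column 2 holds the Z basis vector
    return m;
}
```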

Are you using Blender for viewing the models or are you using it as your graphics engine?

The more projects the merrier. If you are interested in helping, I will hustle and get this on GitHub. Regardless, the more information that is out there, the better.

Cheers

PiB 09-13-2012 04:27 AM

I was flipping Z before but not setting the winding order so that's why I was surprised when half of the scene disappeared when I tried to turn on face culling! It seems to work fine now.

I started this project by writing a Python script to load WLD data like zones, zone objects, characters, skeletons etc and import it to Blender as a kind of prototype. When this worked well enough I rewrote the code in C++ and used OpenGL. But I kept the scripts around which can be handy for debugging sometimes.

The more the merrier, I agree! I will upload this code to GitHub too so we can share the code and information.

There are some features you mentioned that I haven't gotten around to yet (properly supporting transparency, animated textures, minor things like the sky box...). How is your work on animations/skeletons going? I am currently trying out different ideas for implementing lighting; it's not as straightforward as I thought it would be.

rhyotte 09-13-2012 05:42 AM

This is getting better and better! Yaay! Cannot wait until I can drop Windows, run just Linux, and still be able to play EQEmu natively :)

PixelEngineer 09-13-2012 09:38 AM

Quote:

Originally Posted by PiB (Post 212489)
I was flipping Z before but not setting the winding order so that's why I was surprised when half of the scene disappeared when I tried to turn on face culling! It seems to work fine now.

I started this project by writing a Python script to load WLD data like zones, zone objects, characters, skeletons etc and import it to Blender as a kind of prototype. When this worked well enough I rewrote the code in C++ and used OpenGL. But I kept the scripts around which can be handy for debugging sometimes.

The more the merrier, I agree! I will upload this code to GitHub too so we can share the code and information.

There are some features you mentioned I haven't got around to do yet (properly supporting transparency, animated textures, minor things like sky box...). How is your work on animations/skeletons going? I am currently trying out different ideas for implementing lighting, it's not as straightforward as I thought it would be.

I am currently working on animations/skeletons. I have not run into any problems. It's just a matter of getting all of the fragments loaded in a way that they can be used quickly when rendering.

For transparency, you really need to use the BSP tree while rendering. I assume you could get away without it but it would be much more work. I render every visible surface that is in the PVS and frustum recursively going front to back to prevent overdraw. Every time I come across a batch of polygons that are transparent, I add the offset and information to my "transparency stack". I chose the stack because you need to render back to front with transparency and a stack is an ideal data structure given the order of entry while rendering front to back.
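The transparency-stack idea above can be sketched as follows; DrawBatch and its fields are illustrative names, not the actual client's structures:

```cpp
#include <cstddef>
#include <stack>
#include <vector>

// A batch of polygons in the zone vertex buffer (illustrative names).
struct DrawBatch {
    std::size_t vboOffset;   // where the batch starts in the VBO
    std::size_t polyCount;   // how many polygons it covers
    bool transparent;
};

// Given batches already in front-to-back order (as the BSP traversal
// yields them), draw opaque batches immediately and defer transparent
// ones on a stack. Because the stack is LIFO, the farthest transparent
// batch (pushed last) pops first, giving the back-to-front order that
// alpha blending requires.
std::vector<std::size_t> renderOrder(const std::vector<DrawBatch>& frontToBack) {
    std::vector<std::size_t> order;     // batch indices in draw order
    std::stack<std::size_t> transparencyStack;
    for (std::size_t i = 0; i < frontToBack.size(); ++i) {
        if (frontToBack[i].transparent)
            transparencyStack.push(i);  // defer until the opaque pass ends
        else
            order.push_back(i);         // opaque: front-to-back limits overdraw
    }
    while (!transparencyStack.empty()) {
        order.push_back(transparencyStack.top());
        transparencyStack.pop();
    }
    return order;
}
```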

For animated textures, make sure you have created a texture data structure that supports numerous bitmaps. One of the unknowns in the 0x04 fragments is the millisecond delay between texture switching. Keep track of the time, and if it goes over the delay amount, switch the index of the bitmap you use for that texture.
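A minimal sketch of that bookkeeping, with illustrative field names (the real 0x04 fragment layout isn't assumed here):

```cpp
#include <cstdint>
#include <vector>

// An animated texture: one bitmap per frame plus the per-frame delay
// read from the 0x04 fragment. Field names are illustrative.
struct AnimatedTexture {
    std::vector<int> bitmapIds;  // e.g. GL texture ids or array layers
    std::uint32_t delayMs = 0;   // milliseconds between frame switches
};

// Pick the bitmap to bind at the current time. Taking the absolute time
// modulo the full cycle keeps every instance of the texture in sync.
int currentFrame(const AnimatedTexture& tex, std::uint32_t nowMs) {
    if (tex.bitmapIds.empty())
        return -1;
    if (tex.bitmapIds.size() == 1 || tex.delayMs == 0)
        return tex.bitmapIds[0];
    std::uint32_t cycle =
        tex.delayMs * static_cast<std::uint32_t>(tex.bitmapIds.size());
    return tex.bitmapIds[(nowMs % cycle) / tex.delayMs];
}
```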

The skybox was a bit more tricky. Just picture someone walking around with a dome around their head. Clear the depth buffer and render as usual. I can elaborate on this if needed.
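The draw order for that trick can be sketched as a tiny frame loop; the callbacks stand in for the real passes (the depth clear would be glClear(GL_DEPTH_BUFFER_BIT) in a live GL context):

```cpp
#include <functional>

// Per-frame order for the sky-dome trick: the dome is tiny and follows
// the camera, so its depth values must be discarded before the zone
// draws, or the zone would fail every depth test against it.
void renderFrame(const std::function<void()>& drawSkyDome,
                 const std::function<void()>& clearDepthBuffer,
                 const std::function<void()>& drawZone) {
    drawSkyDome();       // small dome centered on the camera, drawn first
    clearDepthBuffer();  // forget the dome's depth: it sits inches away
    drawZone();          // zone now draws "in front of" the sky everywhere
}
```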

As for lighting, I have implemented just the zone lighting that was in the original zones. Instead of dynamic lighting or lightmaps like Quake 3, they simply shaded the polygons with the color of nearby lightsources at each vertex. They essentially "faked" the lighting.

I will work towards getting my code on github this week as well. Probably getting static player/NPC models loaded would be a good place for me to take a small break. Let me know if you have any other questions in the meantime.

Cheers!

PiB 09-14-2012 04:26 AM

Quote:

Originally Posted by PixelEngineer (Post 212491)
I am currently working on animations/skeletons. I have not run into any problems. It's just a matter of getting all of the fragments loaded in a way that they can be used quickly when rendering.

I think one issue you will probably run into is that many characters have few or no animations you can load through 0x14 fragments. I am pretty sure the reason was to save space by not duplicating animations. Many characters seem to share animations. For example, barbarians, dark/high/half elves, erudites and humans all use wood elf animations with some additions (see video). Same for dragons and quite a lot of mobs. I have made a list of the most common vanilla/Kunark/Velious characters I could find and which animations they use.

Quote:

Originally Posted by PixelEngineer (Post 212491)
For transparency, you really need to use the BSP tree while rendering. I assume you could get away without it but it would be much more work. I render every visible surface that is in the PVS and frustum recursively going front to back to prevent overdraw. Every time I come across a batch of polygons that are transparent, I add the offset and information to my "transparency stack". I chose the stack because you need to render back to front with transparency and a stack is an ideal data structure given the order of entry while rendering front to back.

How do you sort from front to back: do you do it per object/region using its AABB, or per face? Or can you traverse the BSP front-to-back somehow? I thought the divisions between planes were arbitrary. Anyway, that's one more thing I have to implement; right now I'm using an octree and frustum culling for this. I guess this is not the most efficient, but it will probably come in handy for keeping track of characters. One thing I was wondering: isn't the usefulness of the PVS limited in outdoor zones like the Karanas, where you can see from very far away? Obviously I'm sure this works pretty well in dungeons.

Quote:

Originally Posted by PixelEngineer (Post 212491)
For animated textures, make sure you have created a texture data structure that supports numerous bitmaps. One of the unknowns in the 0x04 fragments is the millisecond delay between texture switching. Keep track of the time and if it goes over the amount in the delay, switch index of the bitmap you will use for that texture.

I am using 2D texture arrays for textures so that I can draw static objects (and eventually characters) in one draw call. This should make it straightforward to have animated textures. Each vertex has a third texture coordinate for the layer in the array. I plan to add a kind of offset table in the vertex shader so that you can specify alternate textures (if animated, or to have different kind of equipments) without changing texture bindings.
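A CPU-side sketch of that indirection; in GLSL the table would be a uniform integer array consulted before sampling the sampler2DArray. All names here are illustrative:

```cpp
#include <cstddef>
#include <vector>

// Vertex layout for drawing from a 2D texture array: u/v plus a third
// coordinate selecting the array layer (illustrative names).
struct Vertex {
    float u, v;
    float layer;  // base layer baked into the VBO at load time
};

// Indirection table: maps a baked base layer to the layer actually
// sampled, so animated frames or equipment variants can be swapped
// without changing texture bindings or rewriting vertex data.
float resolveLayer(const Vertex& vert, const std::vector<int>& layerOffsets) {
    std::size_t base = static_cast<std::size_t>(vert.layer);
    return static_cast<float>(layerOffsets[base]);
}
```

Updating one table entry per frame is enough to retarget every vertex that references that layer.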

Quote:

Originally Posted by PixelEngineer (Post 212491)
The skybox was a bit more tricky. Just picture someone walking around with a dome around their head. Clear the depth buffer and render as usual. I can elaborate on this if needed.

How did you determine the scale of the dome? Do you use some kind of scaling factor that you multiply with the width/length of the zone?

Quote:

Originally Posted by PixelEngineer (Post 212491)
As for lighting, I have implemented just the zone lighting that was in the original zones. Instead of dynamic lighting or lightmaps like Quake 3, they simply shaded the polygons with the color of nearby lightsources at each vertex. They essentially "faked" the lighting.

I think I have tried something similar. I started with determining which lights affect an object and send the array of lights used in the vertex shader for the object. This approach didn't scale very well with a lot of lights. I tried to compute the per-vertex lighting once, when loading the zone files. Then I didn't have to do any lighting in shaders but the result was quite ugly (since it's done per-vertex and old zones have very large polygons). I will try deferred shading next (so I can do per-fragment lighting in one pass) but I think this will be quite a lot of work.

Quote:

Originally Posted by PixelEngineer (Post 212491)
I will work towards getting my code on github this week as well. Probably getting static player/NPC models loaded would be a good place for me to take a small break. Let me know if you have any other questions in the meantime

Sound good, keep up the good work!

PixelEngineer 09-14-2012 09:39 PM

Quote:

Originally Posted by PiB (Post 212501)
I think one issue you will probably run into is that many characters have few or no animations you can load through 0x14 fragments. I am pretty sure the reason was to save space instead of duplicating animations. Many characters seem to share animations. For examples, barbarians, dark/high/half elves, erudites and humans all use wood elf animations with some additions (see video). Same for dragons and quite a lot of mobs. I have made a list of the most common vanilla/Kunark/Velious characters I could find and which animations they use.

Is that your program? Nice work. Do you have a copy of the source?

Quote:

Originally Posted by PiB (Post 212501)
How do you sort from front to back, do you do it per object/region using its AABB or per face? Or can you traverse the BSP front-to-back somehow? I thought the division between planes were arbitrary. Anyway that's one more thing I have to implement, right now I'm using an octree and frustum culling for this. I guess this is not the most efficient. But it will probably come in handy for keeping track of characters. One thing I was wondering, isn't the usefulness of the PVS limited in outdoor zones like the Karanas where you can see from very far away? Obviously I'm sure this works pretty well in dungeons.

The BSP tree has done pretty much all of the work for you. A BSP tree is made up of arbitrarily sized regions divided by split planes. Things in front of each split plane are found on the left side of each tree node and things behind it will be found on the right. The way to correctly render front to back is to recursively iterate the tree visiting the child nodes the camera is in front of first.

In terms of rendering transparency back to front, as I mentioned, I use a stack. It holds the offset in my VBO as well as the number of polygons. Because a stack is a last-in, first-out data structure, when I render front to back the polygon batches that go in first come out last.

Here is some tree traversal code demonstrating what happens:

Code:


    // Signed distance from the camera to the node's split plane
    float distance = (camera.getX() * tree[node].normal[0])
                   + (camera.getY() * tree[node].normal[1])
                   + (camera.getZ() * tree[node].normal[2])
                   + tree[node].splitdistance;

    if (distance > 0)
    {
        // Camera is in front of the split plane: visit the near (left) child first
        renderGeometry(cameraMat, tree[node].left, curRegion);
        renderGeometry(cameraMat, tree[node].right, curRegion);
    }
    else
    {
        // Camera is behind the split plane: visit the near (right) child first
        renderGeometry(cameraMat, tree[node].right, curRegion);
        renderGeometry(cameraMat, tree[node].left, curRegion);
    }

I suppose you can use an octree but you will still have to take the BSP tree into consideration for information about region types (water, lava, PvP) and if you want to use the PVS. It is true that the PVS is pretty inefficient at reducing a lot of the regions you can't see but it's a very inexpensive check to do.

Quote:

Originally Posted by PiB (Post 212501)
How did you determine the scale of the dome? Do you use some kind of scaling factor that you multiply with the width/length of the zone?

I think you are misunderstanding what skydomes really are. EverQuest's skydomes are small half spheres that translate and rotate with the camera. They are drawn first and give the impression that the sky is very large despite being exactly the opposite. Picture someone walking around with a sky-textured bowl on their head. That is essentially the idea, and because it moves with them, it gives the illusion that the sky is vastly infinite. If you were to stretch the skydome over the entire zone and walk the distance, you would notice yourself approaching the edge of the dome, and it would look a bit weird.

The first thing you should render is the skydome; then, clear the depth buffer. The skydome is inches away from the camera, so if you didn't clear the depth buffer, none of the zone would render, because it's all further away than the skydome actually is. After clearing the depth buffer, render as you usually do. It will give the illusion that behind everything rendered is a vast sky.

Quote:

Originally Posted by PiB (Post 212501)
I think I have tried something similar. I started with determining which lights affect an object and send the array of lights used in the vertex shader for the object. This approach didn't scale very well with a lot of lights. I tried to compute the per-vertex lighting once, when loading the zone files. Then I didn't have to do any lighting in shaders but the result was quite ugly (since it's done per-vertex and old zones have very large polygons). I will try deferred shading next (so I can do per-fragment lighting in one pass) but I think this will be quite a lot of work.

First, determine what your goal is. My goal is to have the zones render as close to classic EverQuest as possible. The original lighting was simply precomputed vertex colors, which are blended with the textures to give the appearance of lighting. Objects also have vertex colors, as the EverQuest client did not dynamically shade objects. I assume lights.wld contains lighting details just for shading player and mob models.
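Baking such vertex colors at load time can be sketched as below, under the assumption of a simple linear falloff (the real client's falloff curve and the lights.wld fragment layout aren't assumed here):

```cpp
#include <algorithm>
#include <array>
#include <cmath>
#include <vector>

using Vec3 = std::array<float, 3>;

// A point light roughly as it might come out of lights.wld; the field
// names are illustrative.
struct Light {
    Vec3 pos;
    Vec3 color;
    float radius;  // distance at which the contribution reaches zero
};

// Bake a vertex color once at load time: ambient plus each in-range
// light's color scaled by a linear falloff. At render time the client
// just multiplies this color with the texture sample.
Vec3 bakeVertexColor(const Vec3& vertex, const std::vector<Light>& lights,
                     const Vec3& ambient) {
    Vec3 c = ambient;
    for (const Light& l : lights) {
        float dx = vertex[0] - l.pos[0];
        float dy = vertex[1] - l.pos[1];
        float dz = vertex[2] - l.pos[2];
        float d = std::sqrt(dx * dx + dy * dy + dz * dz);
        if (d >= l.radius)
            continue;                   // light too far to matter
        float w = 1.0f - d / l.radius;  // linear falloff to zero
        for (int i = 0; i < 3; ++i)
            c[i] += w * l.color[i];
    }
    for (int i = 0; i < 3; ++i)
        c[i] = std::min(c[i], 1.0f);    // clamp to displayable range
    return c;
}
```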

After I have everything rendering, I will move on to per-pixel lighting. You are correct that per-pixel lighting with the provided surface normals will not look good at all. You really need to be utilizing normal maps for any surface that is rendered with Phong shading.

PiB 09-15-2012 08:38 PM

Quote:

Originally Posted by PixelEngineer (Post 212504)
Is that your program? Nice work. Do you have a copy of the source?

It is. Thanks! I pushed my local Git repo to Github if you want to take a look. This should build on both Windows and Linux (haven't tested MacOS X) using CMake. I really should add a README with build instructions.

Quote:

Originally Posted by PixelEngineer (Post 212504)
The BSP tree has done pretty much all of the work for you. A BSP tree is made up of arbitrarily sized regions divided by split planes. Things in front of each split plane are found on the left side of each tree node and things behind it will be found on the right. The way to correctly render front to back is to recursively iterate the tree visiting the child nodes the camera is in front of first.

I didn't know that the planes in the BSP tree always split the space into front / back half-spaces. Thanks for the clarification.

Quote:

Originally Posted by PixelEngineer (Post 212504)
In terms of rendering transparency back to front, as I mentioned, I use a stack. It holds the offset in my VBO as well as the number of polygons. Because a stack is a last in first out data structure when I render front to back the polygon batches that go in come out last.

Here is some tree traversal code demonstrating what happens:

...

I suppose you can use an octree but you will still have to take the BSP tree into consideration for information about region types (water, lava, PvP) and if you want to use the PVS. It is true that the PVS is pretty inefficient at reducing a lot of the regions you can't see but it's a very inexpensive check to do.

Yeah, I'm working on using this BSP tree now. This should be faster to traverse than my octree and I need it to determine the ambient light of the region anyway (even though many zones seem to have the same ambient light in all regions). The traversal code looks really straightforward, thanks.

Quote:

Originally Posted by PixelEngineer (Post 212504)
I think you are misunderstanding what skydomes really are. EverQuest's skydomes are small half spheres that translate and rotate with the camera. They are drawn first and give the impression that the sky is very large despite it being exactly the opposite. Picture someone walking around with a sky textured bowl on their head. This is essentially the idea and because it moves with them, it gives the illusion that the sky is vastly infinite. If you were to stretch the skybox over the entire zone and walk the distance, you could notice approaching the edge of the diameter and it would look a bit weird.

The first thing you should render is the skydome. Then, clear the depth buffer. Because the skydome is inches away from the camera if you didn't clear the depth buffer, none of the zone would render because it's all further away than the skydome actually is. After clearing the depth buffer, render as you usually do. It will give the illusions that behind everything rendered is a vast sky.

Ah, that makes sense. I think I understand now. Does this mean you render the sky box in camera space?

ankhamunn 11-02-2012 09:38 PM

Hey guys. I was able to get VS2010 and AnkhSVN working and was able to even get the code compiled and running. It's a pretty big thrill when you're able to open gfay and have a look around in a completely separate client, compiled on your own machine.

Is there still being work done on this? PE - I sent you a PM and an email but haven't heard from you yet.

Also - I'm not very familiar with building/branching public SVN projects but I'm reading a book that highly recommends organizing any game development code into several folders: Docs, Media, Source, Obj, Bin, Test. Any thoughts on doing this early in the process (before the client gets more elaborate than it is now)?

Nice work though - I'm really excited to have this resource. Also, huge kudos for commenting and organizing your code such that someone can jump into it and understand, basically, what's going on! I have little/no experience with C++ or graphics in C but you've put together a project that makes sense to me - VERY Exciting!

ankhamunn 11-03-2012 11:32 PM

So I've spent more time tinkering with the Assembla repo and VS settings and realized that I can do all of the custom folder structure stuff on my local computer by myself - Silly me! Apologies - I've never used VS before so still getting used to all of it.

I also took the time tonight to read through some of the earlier posts in depth (rather than skimming through). Any word on when you'd feel comfortable releasing the code on github? I figure the assembla code would be good to get a leg up on what's going on but would it be worth working on this code at all?

PixelEngineer 11-11-2012 11:20 PM

I am still working on this client, albeit slowly. I have something super interesting I am working on. Still classic but will definitely change the way EverQuest is viewed. I will let everyone know when I am done.

Thanks for the interest in developing. I know there is another developer around working on some things as well. It's pretty cool as he's pretty much got animations figured out and I was able to help with other stuff I had already finished.

I will update soon.

rhyotte 11-12-2012 04:05 AM

Friggen TEASE !!

ankhamunn 11-12-2012 09:02 PM

No kidding ;) Looking forward to seeing the progress!

KLS 11-13-2012 06:56 PM

Un-sticking this for now as it includes more closed source development than current open source development. Which is fine but not deserving of a sticky post in a FOSS development forum any longer.

Tyen05 01-13-2013 03:45 PM

Animations/models/textures all extracted.

http://bit.ly/X5VUln

Tyen05 01-19-2013 06:44 AM

Example - animations in unity

http://bit.ly/VAqjuG

Amer 01-19-2013 07:03 PM

Awesome Tyen! What's your next steps?

bristle 01-24-2013 06:56 PM

Quote:

Originally Posted by Tyen05 (Post 216327)
Animations/models/textures all extracted.

http://bit.ly/X5VUln

I took one of the models, converted the FBX to DAE, and imported it into OpenSim. Then I converted it to OBJ and imported it into ZBrush. I am sure I could do it for Blender too, and it has FBX export.

So in theory, I could convert FBX to Collada, import it into Blender, add more animations, and then export back to FBX.

I don't know what to do after that for EQ.

bristle 01-25-2013 12:16 AM

Well, good luck. I am going back to OpenSim.

bristle 01-26-2013 11:40 PM

I was going to write about third-party SL clients, OpenSim, and EQEmu, but I am not ready. Right now, I can do everything I need in Unity with a fake server. Then I can transfer to a third-party SL client in C++.

