
Thoughts about a Project for Experience


I am interested in doing some cool projects to gain experience so that I will be able to work.  My objective was to create an emulated cash register program for a restaurant that uses SQL and the wxWidgets toolkit.  I like the idea of getting some experience with a GUI.  I was thinking of making a real project out of it: for example, have order information sent to a separate window for the kitchen, log orders for later viewing, and allow orders to be customized.

I am now beginning to reconsider my decision on only my second real day, even though I am having no difficulties.  I have learned that it isn't really possible to run a render loop with the toolkit, and I am still interested in games too.  I am wondering whether it is worthwhile to use this library with an SQL library for this project.  Will it satisfy my goal of getting experience?  It is mostly an event-driven GUI library.  Would I gain real-world skills by using a library that seems possibly too limited for real usage?  Is this library used for programming in the real world?  Is it worth accepting these constraints to get experience, or should I use a different, more general library that is more of a real-world tool?  Or, seeing these problems, should I create my own GUI engine instead?  Would I gain enough real-world skills by doing this project?  Maybe I should just do the project quickly, since there will always be constraints of some kind?

 

Thank you,

Josheir


Oh, I thought I'd mention that the cash register is an emulated touchscreen type, except it uses the mouse.

I think it depends on what kind of work you want to do.  Obviously, wxWidgets and other GUI libraries are used in business software, and building a game in it is a great way to get some experience.  But if by "work" you mean game development, I don't think a GUI library is necessarily the way to go.  Most game libraries offer a lot more control than GUI libraries in terms of graphics capabilities.  I've learned a TON of game dev tools, and there are basically 4 types:

  1. Low-level frameworks: These provide just enough of an API to draw graphics, handle player input (keyboard/mouse/etc.) and play sounds.  Examples include SDL, SFML and Allegro for C/C++, PyGame for Python, and the HTML5 canvas for JavaScript (see the minimal SDL sketch after this list).  If you're thinking 3D there's also OpenGL/WebGL, but IMO if you want to build a 3D game you'd go with...
  2. Enterprise game engines: These are big, robust tools that include "everything but the kitchen sink"; these are what the triple-A game dev companies are using (though there are probably some exceptions to that) and learning them is almost as big a task as learning game development.  The big 2 examples lately are Unity (C#) and Unreal (C/C++).  Others include Godot and Game Maker Studio (which has its own made-up scripting language).
  3. The occasional middle ground: These are few and far between; they're libraries that have more than just the basics, but aren't massive game engines.  They support things like tiles, animations, rooms/scenes and other stuff that's important for games.  I've been looking for libraries like this, and never really found one for C/C++ (which I've been assuming you want to use since you mentioned wxWidgets).  In JavaScript, there's one called Phaser that's actually pretty darn good.
  4. Libraries: All of the above are less like libraries and more like frameworks - they dictate how your code is set up (to a certain extent), and the whole game structure revolves around their way of doing things.  But you could combine any number of smaller libraries to structure the game your way.  For example, I might want to use DirectInput for input, SDL for graphics, and OpenAL for sound (I don't know why you would want that, and IMO that's a stupid and weird combo, but it's an example).  This is usually not done, except in companies that want to use their own engine instead of e.g. Unreal.  It's really not great experience in game dev, but it will definitely get you in the swing of linking libraries, learning from API documentation/manuals, and other practices that are VERY common in software development in general.  And actually, that's how 1 through 3 were invented - someone took a bunch of lower-level libraries and combined them in interesting ways to come up with something new.  It's overkill IMO if you're a beginner, but I figured it was still worth mentioning.
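To make category 1 concrete, here is a rough sketch of what a program built on a low-level framework looks like, using SDL2 (the window size, title and colours are arbitrary, and error handling is trimmed down):

```cpp
// Minimal SDL2 skeleton: create a window, run a game loop, quit on close.
#include <SDL.h>

int main(int argc, char** argv) {
    (void)argc; (void)argv;
    if (SDL_Init(SDL_INIT_VIDEO) != 0) return 1;

    SDL_Window* window = SDL_CreateWindow("Demo", SDL_WINDOWPOS_CENTERED,
                                          SDL_WINDOWPOS_CENTERED, 640, 480, 0);
    SDL_Renderer* renderer = SDL_CreateRenderer(window, -1, SDL_RENDERER_ACCELERATED);

    bool running = true;
    while (running) {                    // the game loop
        SDL_Event event;
        while (SDL_PollEvent(&event)) {  // drain pending input events
            if (event.type == SDL_QUIT) running = false;
        }

        // ... update game state here ...

        SDL_SetRenderDrawColor(renderer, 0, 0, 0, 255);
        SDL_RenderClear(renderer);
        // ... draw the frame here ...
        SDL_RenderPresent(renderer);
    }

    SDL_DestroyRenderer(renderer);
    SDL_DestroyWindow(window);
    SDL_Quit();
    return 0;
}
```

Compile it against SDL2 and you get a window, input handling and a render loop; everything else is up to you.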

HTH :)

The Geek on Skates

http://www.geekonskates.com

I would say you need to think about what "gain experience" means to you: what kind of experience are you looking for? Try to make it as concrete as possible, I'd say; make a list of things or so, at least for yourself, to get a clearer picture.

If your goal is "I can use wxWidgets and SQL directly in my next game", I would agree with you that the project is not very useful, unless your next game is a multi-player desktop game with lots of data.

However, I would not discard it as a waste of time. In my view, the point is not the foo-widgets toolkit or the bar-database library; foo and bar will be different every single time. That's fine, you can learn new foos and bars quickly once you have seen one.

The point of experimenting in another domain, to me, is to extend your set of tools, and to gain knowledge about the strong and weak points of tools (both the ones you already know and the new ones).

 

Today, a game has a game-loop, period - and so, the thinking goes, every application should have one.

You read about wxWidgets, and hey, they don't have a game loop! You seem rather upset about it, but I'd suggest you ask yourself "why?", as in "why is there no game loop?" and (more complicated) "why do I want one?" (or "why do games have one?", or alternatively "why don't they use event-driven programming?"). Answers to questions like that tell you more about the GUI application domain, the strong and weak points of event-driven programming, the strong and weak points of game-loops, and ultimately how event-driven programming and game-loops compare, and when to pick each solution (or not).

With that experience, you can better see the weak points of game-loops and spot alternative solutions which may work better. That may lead to reconsidering the "every game should have a game-loop" premise when that looks useful, resulting in a better program as well as a better programmer.
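To make the contrast concrete, here is a minimal sketch of the event-driven shape, assuming wxWidgets 3.x (the class and handler names are invented for the example).  Notice that there is no loop in the user code at all: the toolkit blocks inside its own event loop and calls your handlers when something happens.

```cpp
// Minimal wxWidgets sketch: the framework owns the (blocking) event loop.
#include <wx/wx.h>

class OrderFrame : public wxFrame {
public:
    OrderFrame() : wxFrame(nullptr, wxID_ANY, "Event-driven demo") {
        auto* button = new wxButton(this, wxID_ANY, "Ring up order");
        button->Bind(wxEVT_BUTTON, &OrderFrame::OnOrder, this);
    }
private:
    void OnOrder(wxCommandEvent&) {
        // React to the event, return quickly, and the application blocks again.
        wxLogMessage("Order received");
    }
};

class OrderApp : public wxApp {
public:
    bool OnInit() override {
        (new OrderFrame())->Show(true);
        return true;                 // hand control to wxWidgets' event loop
    }
};

wxIMPLEMENT_APP(OrderApp);           // expands to main(); runs the event loop
```

Compare that with a game loop, where your own while-loop polls input, updates and redraws every frame whether or not anything happened.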

 

9 hours ago, The Geek on Skates said:

these are what the triple-A game dev companies are using (though there are probably some exceptions to that) and learning them is almost as big a task as learning game development.  The big 2 examples lately are Unity (C#) and Unreal (C/C++).  Others include Godot and Game Maker Studio (which has its own made-up scripting language)

So, most triple-A 3D game developers are using these Enterprise Game Engines, is this right?  What percentage make their own?

Thanks,

Josheir

 

9 hours ago, Alberth said:

If your goal is "I can use wxWidgets and SQL directly in my next game", I would agree with you that the project is not very useful, unless your next game is a multi-player desktop game with lots of data.

How would wxWidgets be used for a multiplayer game?

Josheir 

Those two posts were really helpful, thank you!

Josheir

14 minutes ago, Josheir said:

How would wxWidgets be used for a multiplayer game?

I always have a hard time understanding such questions.

I don't see anything special in this combination. It feels like asking "can I use a chair to eat my dinner?": two unrelated things that are apparently connected in some way that implies trouble in some unexplained way.

Afaik, wxWidgets is a GUI toolkit: it displays a graphical, form-like screen that you can interact with. "Multi-player" means at least two players play the same game (probably at the same time) and interact with each other. I think you can have two programs running (for simplicity, let's say on two computers), both using wxWidgets for display, and connected to each other to play a game together. Am I missing anything here? Is there something specific you would want to know?

 

 

I didn't think there was a main loop, which I thought might make this impossible. I have also made a chat client over a network with SFML and TGUI.

Josheir

What I'm trying to say is that checking for a received message would need to happen in a loop, and my earlier post was about exactly this: the ineffectiveness of a (render) loop when using wxWidgets.

Josheir

8 hours ago, Josheir said:

What I'm trying to say is that checking for a received message would need to happen in a loop, and my earlier post was about exactly this: the ineffectiveness of a (render) loop when using wxWidgets.

A desktop application is normally designed to be well-behaved among the other desktop applications. You try to reduce memory and CPU usage, giving other desktop applications room to do their work. A computer running a desktop that is not being used has an idle CPU: everything is blocked, waiting for the next action of the user. Games, on the other hand, want as much CPU and as many resources as they can get, to give the user the best possible experience. They boldly assume they are the only application running, and burn CPU cycles like there is no tomorrow. In other words, the basic premise of a desktop application is the complete opposite of that of a game.

So the basic idea of a desktop application is to block until something "interesting" happens, an event. Then you deal with the event as efficiently as you can, and you block again, waiting for the next event. You don't loop, you don't poll the network to check if something happened; you block until the next event. For a cashier application this makes a lot of sense, of course. If you open the shop at 11 in the morning and the first customer arrives at 4 in the afternoon, there is no point in wasting 5 hours of CPU time checking that nothing happened and rendering the exact same display at 50 fps. Even when in use, a user enters a handful of key-presses (or, in your case, mouse clicks) every 10 minutes, in between pouring and serving drinks, walking between kitchen and customers, taking the dishes back to the kitchen, and cleaning the tables for the next customer.

Can you have a networked, animated desktop application? Sure you can. You deal with the network by blocking until data arrives or until there is room to send outgoing messages. Unix has a "select" call for exactly that purpose (http://man7.org/linux/man-pages/man2/select.2.html); Windows likely has something similar. Some GUI toolkits offer this as a standard event that notifies you of network activity; I am not sure if wxWidgets has that. If not, you can always set up a separate thread that deals with the network and pushes events to the main loop, to let the main application know it should check the messages received by that thread.
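For illustration, a minimal sketch of that blocking idea with POSIX select(); the function name is made up, and it assumes you already have a connected socket descriptor:

```cpp
// Block inside select() until the socket has data to read; the process uses
// no CPU while it waits. See the select(2) man page linked above.
#include <sys/select.h>

bool wait_for_data(int sock) {
    fd_set readable;
    FD_ZERO(&readable);
    FD_SET(sock, &readable);

    // A null timeout means: block until something happens on the socket.
    int ready = select(sock + 1, &readable, nullptr, nullptr, nullptr);
    return ready > 0 && FD_ISSET(sock, &readable);
}
```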

For animations, you need some form of regular event arrival. Some toolkits have timer events for this purpose, others have an idle event that you can use. Of course, a thread with a timer that pushes events is also an option.

Note that you have very few timing guarantees: your application may be minimized, or the computer may be busy running a different application. That is a natural consequence of being one of many applications running.
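As a rough sketch of the timer-event approach from two paragraphs up (wxWidgets does provide wxTimer; the class names here are made up), keeping in mind that the tick rate is only approximate:

```cpp
// Drive a simple animation from wxTimer events instead of a render loop:
// a square moves across the frame at roughly 60 ticks per second.
#include <wx/wx.h>
#include <wx/timer.h>

class AnimFrame : public wxFrame {
public:
    AnimFrame() : wxFrame(nullptr, wxID_ANY, "Timer-driven animation"),
                  m_timer(this) {
        Bind(wxEVT_TIMER, &AnimFrame::OnTick, this);
        Bind(wxEVT_PAINT, &AnimFrame::OnPaint, this);
        m_timer.Start(16);            // ~60 ticks per second, not guaranteed
    }
private:
    void OnTick(wxTimerEvent&) {
        m_x = (m_x + 2) % 300;        // advance the animation state
        Refresh();                    // request a repaint (a paint event)
    }
    void OnPaint(wxPaintEvent&) {
        wxPaintDC dc(this);
        dc.DrawRectangle(m_x, 50, 20, 20);
    }
    wxTimer m_timer;
    int m_x = 0;
};

class AnimApp : public wxApp {
public:
    bool OnInit() override {
        (new AnimFrame())->Show(true);
        return true;
    }
};

wxIMPLEMENT_APP(AnimApp);
```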

 

As for inefficiency, how do you define that? Above I just argued that games are highly inefficient in their CPU usage when nothing of interest is happening. Given how rarely events occur (users and the network are just plain terribly slow from a CPU point of view), event-driven updates are extremely efficient. Note that since the CPU is normally idle, when something does happen you generally have the full CPU available as well. You would generally do animations on a plain canvas; if you go through the event system it is indeed slower, but really, how much animation do you need to press a button? More modern toolkits integrate the GPU, so you can have hardware-rendered surfaces inside the application, e.g. JavaFX 2 (just pointing out its existence; I have no real experience with it beyond playing with it a bit). Likely other toolkits do something similar.

As a question to ponder: isn't it funny that a game burns CPU cycles like a madman, only to find that the user or the network didn't do anything for about 50-70% of the time, and then we find ourselves running short of CPU time elsewhere?

 

