Steering a big ship

Published March 20, 2010

I've put a lot of thought into this over the past couple of weeks, and it's taken quite a bit of waffling on my part to settle on a course of action. I'm posting this largely as a way to force myself to commit to a plan, more so than simply to put the information out there.

Epoch, as a project, is going to shift directions.

Up until now my big focus has been on getting the parallel processing side of things up to speed. All of the hype and most of the prose on the project web site have, up to this point, centered on the idea of a language that can do really amazing things by automatically running code on the most appropriate level of hardware.

I now consider that aspect of the language to be "done," not in the sense that it is complete and needs no further work, but in the sense that I've accomplished what I set out to do, which is to build a proof-of-concept that indicates that my ideas of hint-based execution relocation are sound and can be used for real computational purposes. Yes, the demo is a trivial vector addition, but the cross-compiler is rich enough to permit just about anything you could want to cram onto a GPU (at least via CUDA). If I took the time to wire up some of CUDA's built-in library functions, it'd be a very, very solid platform.
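To make the hint-based relocation idea concrete, here's a minimal sketch of the general shape of it in plain C++ - not Epoch syntax and not the cross-compiler's actual output; every name in it (ExecutionHint, add_vectors, and so on) is a hypothetical stand-in:

```cpp
// Minimal C++ sketch of hint-based dispatch for a vector addition.
// All names here (ExecutionHint, add_vectors, add_vectors_gpu) are
// hypothetical illustrations of the concept, not Epoch's actual API.
#include <cassert>
#include <cstddef>
#include <vector>

enum class ExecutionHint { Cpu, Gpu };

// Plain CPU implementation.
static void add_vectors_cpu(const std::vector<float>& a,
                            const std::vector<float>& b,
                            std::vector<float>& out)
{
    assert(a.size() == b.size());
    out.resize(a.size());
    for (std::size_t i = 0; i < a.size(); ++i)
        out[i] = a[i] + b[i];
}

// Stand-in for code a CUDA cross-compiler might emit and launch.
// Here it simply falls back to the CPU path so the sketch stays runnable.
static void add_vectors_gpu(const std::vector<float>& a,
                            const std::vector<float>& b,
                            std::vector<float>& out)
{
    add_vectors_cpu(a, b, out);
}

// The "relocation" decision: the caller supplies a hint, and the runtime
// routes the work to whichever backend it deems appropriate.
void add_vectors(ExecutionHint hint,
                 const std::vector<float>& a,
                 const std::vector<float>& b,
                 std::vector<float>& out)
{
    if (hint == ExecutionHint::Gpu)
        add_vectors_gpu(a, b, out);
    else
        add_vectors_cpu(a, b, out);
}

int main()
{
    std::vector<float> a{1.0f, 2.0f, 3.0f}, b{4.0f, 5.0f, 6.0f}, c;
    add_vectors(ExecutionHint::Gpu, a, b, c);   // hint: prefer the GPU path
    return c[0] == 5.0f ? 0 : 1;
}
```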


I have always presented Epoch as a language whose time has not yet come. Even at the very beginning, when the seeds of the Foo Language started to sprout here in my journal, I knew that I was working on a problem that hadn't actually become a problem yet. My original goal was to have a solution ready, so that when the problem truly started causing issues for real-world development, Epoch would be there as a natural and easy choice for transitioning into a new development paradigm.

The problem that originally inspired Epoch's creation was, to put it bluntly, that C++ sucks for writing large-scale software. Embedded scripting languages are the most visible symptom of this issue. DSLs abound nowadays, and rightly so - but the infrastructure on which domain-specific systems are built is shaky, to say the least.


One of the main reasons I enjoy GDC is getting a sense of the pulse of the industry and how people feel about our toolkits. There seems to be a lot of consensus that C++ is in fact a horrid language, and most people who have reached that conclusion are actively casting about for something more suitable. One of the big concepts that sparked Epoch in the beginning was building a language that inherently invites programmers to build DSLs; and the single most oft-repeated wish I heard at GDC this year was for a language that could do exactly that.


Programmers are ready for a better way to work. The climate still isn't quite perfect for a mass language switch-over, but that's fine; it'll take some time to position Epoch so that it is ready to step into that gap once it opens.

The way I see it, Epoch needs three things now, roughly in order of how I will work on them:

  • A truly rich meta-programming system for building and working with DSLs within the context of the language itself

  • A proper object model for improving the language's domain representation capabilities

  • A rich set of tools for doing real-world development in Epoch



My goal at this point is to reimplement the existing Epoch language in terms of itself, using metaprogramming features. I've already started redefining the grammar and retooling the parser to work on this principle; it may affect the way that syntax looks, but I'm more than willing to live with that. Once Epoch is rebuilt, I will have the metaprogramming tools at hand to build the object model precisely the way I want to (which I'm starting to get really excited about - stay tuned for details on how I plan to build object modelling into the language).
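For a rough flavor of what I mean by defining the language in terms of itself, here's a plain C++ sketch with made-up names - not Epoch's actual grammar machinery - of the general idea that the parser's rules live in a table that user-level code can extend:

```cpp
// Tiny C++ sketch of the "language defined by a replaceable rule table" idea:
// instead of hard-coding each operator into the parser, handlers are
// registered in a table, so new constructs can be added without touching the
// core loop. Purely illustrative of the approach, not Epoch's actual design.
#include <functional>
#include <iostream>
#include <map>
#include <sstream>
#include <string>

using Handler = std::function<double(double, double)>;

int main()
{
    // The "grammar" for binary operators lives in data, not in the parser.
    std::map<std::string, Handler> operators = {
        {"+", [](double a, double b) { return a + b; }},
        {"*", [](double a, double b) { return a * b; }},
    };

    // A user-level "extension" adds a new operator after the fact.
    operators["max"] = [](double a, double b) { return a > b ? a : b; };

    // Trivial evaluator for "lhs op rhs" expressions.
    auto evaluate = [&](const std::string& text) {
        std::istringstream in(text);
        double lhs = 0.0, rhs = 0.0;
        std::string op;
        in >> lhs >> op >> rhs;
        return operators.at(op)(lhs, rhs);
    };

    std::cout << evaluate("2 + 3") << "\n";    // 5
    std::cout << evaluate("2 max 3") << "\n";  // 3, via the registered extension
    return 0;
}
```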

Lastly, and possibly most critically, once the domain-modelling stuff is in place, I can start working on building an IDE for the language itself, which will hopefully bring us up to a level where exploring the language is a low-risk venture for newcomers.


So there's the big spiel; I'm still working through whatever nasty infection I picked up in my travels, so don't expect much out of me for the next couple of days. (But yes, I am still working on GDC coverage and will start getting that posted ASAP.)

Comments

Ng
Looks like exciting stuff, keep up the good work!

On the C++ note, I used to like it more than I do today, even though I've always had a "use-only-the-language-features-I-need-for-the-solution" philosophy. After being in close contact with bad C++ code for some years, I've been enjoying the blissful programming experience of pure ANSI C for my solutions.

For a down-to-earth example, (more) generic programming in pure C means void pointers and function pointers abound, but that's cool, because that's about all the language offers. If you want to do it in C++, there are better ways than void*'s and such, _but_ there's a can of worms waiting around the corner. There are cans and worms in C, but they can fit in my head all at once; if you can do that with C++, pray tell!
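To make that contrast concrete, here's a minimal, generic illustration (nothing Epoch-specific - just the standard library's qsort and std::sort):

```cpp
// The contrast described above: C-style genericity via void* and function
// pointers (qsort) versus the typed C++ alternative (std::sort).
#include <algorithm>
#include <cstdlib>
#include <iostream>

// C-style comparator: works for any element type, but only through void*,
// so the compiler cannot check that the pointers really point at ints.
static int compare_ints(const void* lhs, const void* rhs)
{
    const int a = *static_cast<const int*>(lhs);
    const int b = *static_cast<const int*>(rhs);
    return (a > b) - (a < b);
}

int main()
{
    int c_style[] = {3, 1, 2};
    std::qsort(c_style, 3, sizeof(int), compare_ints);

    int cpp_style[] = {3, 1, 2};
    std::sort(cpp_style, cpp_style + 3,
              [](int a, int b) { return a < b; });  // fully type-checked

    std::cout << c_style[0] << " " << cpp_style[0] << "\n";  // 1 1
    return 0;
}
```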

I'm rambling. Back to topic...

I've recently read some posts from Greg Pfister's The Perils of Parallel, including one on parallel languages (it's a series of 3 posts, actually). This comes to mind when I think about your work (from the blog):

Quote: A key reason, and probably the key reason, is application longevity and portability. Applications outlive any given generation of hardware, so developers must create them in languages they are sure will be available over time - which is impossible except for broadly used, standard languages. You can't assume you can just create a new compiler for wonderful language X on the next hardware generation, since good compilers are very expensive to develop. (I first proposed this as a reason; there was repeated agreement.) That longevity issue particularly scares independent software vendors (ISVs). (Pointed out by John D. McCalpin, from his experience at SGI.)

The investment issue isn't just for developing a compiler; languages are just one link in an ecosystem. All links are needed for success. A quote: "the perfect parallel programming language will not succeed [without, at introduction] effective compilers on a wide range of platforms, ... massive support for re-writing applications," analysis, debugging, databases, etc. (Horst Simon, Lawrence Berkeley Laboratory Associate Lab Director CS; from a 2004 NRC report.)

Users are highly averse to using new languages. They have invested a lot of time and effort in the skills they use writing programs in existing languages, and as a result are reluctant to change. "I don't know what the dominant language for scientific programming will be in 10 years, but I do know it will be called Fortran." (quoted by Allan Gottlieb, NYU). And there's a Catch-22 in effect: A language will not be used unless it is popular, but it's not popular unless it's used by many people. (Edelman, MIT)

Since the only point of parallelism is performance, high quality object code is required at the introduction of the new language; this is seldom the case, and of course adds to the cost issue. (Several)


So it's good that you do have in mind that an excellent set of tools to accompany Epoch is an absolute must for good adoption.
March 21, 2010 07:52 AM
evolutional
I'm not entirely sure of what your new goals are about. What do you mean by metaprogramming in Epoch? Are you talking about allowing people to write languages and extensions on top of the Epoch core?
March 23, 2010 02:35 PM