Over the past year in particular, many of the graduates applying for programming positions at Tag Games have portfolios that consist, in their entirety, of Unity 3D projects. It seems that much of the applicants’ coursework, games and out-of-hours projects are made solely using Unity.
Now I like Unity; it’s a great tool and, when used in the correct scenarios, allows for rapid development of 3D games and prototypes. However, Unity also has its flaws (like any tool) and, despite popular thinking, is not always the best solution for every single problem in game development.
What worries me most is that some (not all) of these potential candidates’ programming and game development knowledge starts and stops with Unity and, even more alarmingly, the industry seems to be encouraging this.
A couple of my colleagues attended the recent Develop conference in Brighton, and one of the main themes of the seminars seemed to be: use Unity for everything; it’s cheap, it’s powerful; why use anything else? Is the mobile industry becoming overly dependent on Unity? If Unity was pulled from the virtual shelves tomorrow, what impact would that have on the App Stores? This may seem a hypothetical question, but this kind of scenario has happened before, most notably when RenderWare was bought by EA. Recent rumours even suggest that Unity was subject to a bid from Autodesk; imagine if Apple bought Unity and only licensed it to Apple developers. Does that sound so far-fetched? Imagine the huge blow that would deal the Google Play Store, literally overnight!

This isn’t just true of Unity, but the risks are magnified when many companies’ entire livelihoods depend on a single piece of closed-source software, and one that doesn’t have many viable alternatives. It’s one of the reasons why at Tag Games we have our own in-house engine that often wraps different middleware tools, from FMOD audio to Bullet physics. If for any reason we need to switch middleware, we can do so without a huge impact on our game code (however, that doesn’t stop us using Unity if the task requires it).
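To make the wrapping idea concrete, here is a minimal sketch of the pattern in C++. All of the names (`IAudioSystem`, `StubAudioSystem`, `onExplosion`) are illustrative inventions for this post, not our actual engine code: game code talks to a thin interface, and each middleware package sits behind its own concrete implementation.

```cpp
#include <string>

// Engine-side audio interface; game code depends only on this.
class IAudioSystem {
public:
    virtual ~IAudioSystem() = default;
    virtual void playSound(const std::string& name) = 0;
};

// One concrete class per middleware package. A real FMOD-backed
// implementation would live behind this same interface; this stub
// just records the request so the example is self-contained.
class StubAudioSystem : public IAudioSystem {
public:
    std::string lastPlayed;
    void playSound(const std::string& name) override {
        lastPlayed = name;
    }
};

// Game code never names the middleware, so swapping FMOD for something
// else means swapping the concrete class, not touching this function.
void onExplosion(IAudioSystem& audio) {
    audio.playSound("explosion");
}
```

The same shape works for physics, rendering or any other middleware: the cost is one extra layer of indirection, the payoff is that no single vendor can hold your game code hostage.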
Obviously I am not suggesting that every developer should create their own engine; one of the main benefits of using a standard tool is that new employees are often already familiar with the tech. However, the fact that so many companies use Unity is possibly one of the reasons that so many graduates spend so much time learning the engine. In my opinion this should not come at the cost of general, core programming techniques. Unity should be a tool that people learn much in the way you would learn SVN and Git, or C++ and Python. If Unity no longer existed, a candidate’s entire CV should not be reduced to nothing more than scrap paper.
When presented recently with a portfolio that included a lighting demo in Unity, one of my colleagues asked what was so impressive: “doesn’t Unity do that for you?”. He really wanted to know whether the applicant understood the algorithms for calculating lighting and had the skills to program an efficient shader, but ultimately all the applicant had learned was how to add a light component to a Unity scene. Unity should be used for demonstrating high-level achievements, such as making a game, or for showcasing techniques for which rendering is just an aside (if you are creating algorithms for mimicking realistic smoke, you don’t want to spend 20% of your time creating a particle renderer).
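The kind of understanding my colleague was probing for isn’t exotic. The core of simple diffuse lighting, the thing that light component is doing for you, is a few lines of maths. A minimal sketch of the Lambertian diffuse term (the `Vec3` type here is illustrative):

```cpp
#include <algorithm>
#include <cmath>

// Bare-bones vector type for the example.
struct Vec3 { float x, y, z; };

float dot(const Vec3& a, const Vec3& b) {
    return a.x * b.x + a.y * b.y + a.z * b.z;
}

Vec3 normalize(const Vec3& v) {
    float len = std::sqrt(dot(v, v));
    return { v.x / len, v.y / len, v.z / len };
}

// Lambertian diffuse factor: max(0, N . L), where n is the surface
// normal and l points from the surface towards the light. Surfaces
// facing the light get 1, surfaces facing away are clamped to 0.
float lambert(const Vec3& n, const Vec3& l) {
    return std::max(0.0f, dot(normalize(n), normalize(l)));
}
```

A candidate who can write and explain this, and its per-pixel equivalent in a fragment shader, has demonstrated far more than one who can drag a light into a scene.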
So continue to use Unity to make games and tech demos, but also understand what it is doing under the hood. Understand the basics of rendering, shaders, scene management, etc., and remember that, contrary to popular belief, not every company is using Unity. In fact, I’d wager that most games are still written in C/C++. I just wonder whether the mobile industry should be investing more time in open-source technologies in case EA decide to get their wallet out again!
As an aside I wonder if Unity themselves are finding it difficult to hire candidates with the appropriate skills?