A few days ago a coworker sent me a link to a very new HTML5 2D graphics library called Pixi.js, telling me to “Check this out.” I did check it out, and was immediately very pleased with what I saw.
Pixi.js arrived at the perfect time for me. I had been planning to start working again on the HTML5 rewrite of an unfinished Flash game. I had already made my own Canvas2D engine, but was considering switching to EaselJS, to conform with my general philosophy that it is better to get on with making a game than to build an engine. When I saw Pixi.js I instantly knew that I wanted to use it for my project.
The thing that makes Pixi.js so appealing to me is that it is primarily a WebGL renderer, so it prioritises the best-performing environment in the browser, but it falls back to the standard 2D canvas context, so it will work in all modern browsers. The great thing about this is that the most common case where it will fall back to the 2D context, at least on desktop, is Internet Explorer, which has a decent hardware-accelerated 2D canvas element.
Pixi has only just been released to the public, but it has hit the scene in very tidy shape. Good Boy Digital has obviously planned this initial release well. They have ensured that the documentation is well presented, and have built an attractive and impressive demo game that shows off what the engine can do.
The demo game is instant proof that Pixi.js can offer the power needed to make a great game in the browser using WebGL. There are also some benchmarks that get more and more exciting the more bunnies or pirates you add to the scene.
The documentation gives a very concise overview of how simple and usable Pixi.js is. Unlike projects like EaselJS, it offers only the features you really need. You can see at a glance what it does. It doesn’t try to be the new Flash; it just gives you what you need, in an API that will be familiar to any Flash developer. It has the standard objects: Stage, DisplayObject, Sprite, MovieClip, Texture etc. It has a full hierarchical display list, and supports JSON sprite atlas loading for animations. In other words, it has exactly what you want, and nothing more. No doubt it will in time gain many more features, both from Good Boy Digital and from the community, but for now it seems good to go for a serious project.
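To give a taste of that Flash-like API, here is a minimal sketch in the style of the early Pixi.js examples. The scene is built inside a function that receives the PIXI namespace and the document, purely to keep the sketch self-contained; in a real page you would just use the globals directly. createBunnyScene and "bunny.png" are my own illustrative names, not anything from the library.

```javascript
// Sketch of a minimal Pixi.js scene, following the v1-era API described
// above ( PIXI.Stage, PIXI.autoDetectRenderer, PIXI.Sprite ). Dependencies
// are passed in only so the sketch stands alone.
function createBunnyScene(PIXI, document) {
  var stage = new PIXI.Stage(0x66ff99);             // stage with a background colour
  var renderer = PIXI.autoDetectRenderer(800, 600); // WebGL if available, else 2D canvas
  document.body.appendChild(renderer.view);

  var bunny = new PIXI.Sprite(PIXI.Texture.fromImage("bunny.png"));
  bunny.position.x = 400;
  bunny.position.y = 300;
  stage.addChild(bunny);

  return { stage: stage, renderer: renderer, bunny: bunny };
}
```

From there, a requestAnimationFrame loop that calls renderer.render( stage ) each frame is all it takes to get things moving.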
I’ve really been enjoying using Pixi.js, and getting on with making my game. Thanks, Good Boy Digital!
It took me a bit of experimentation and online research to get skeletal animation working with the ThreeJS Blender Exporter. I have compiled some tips that may help others who are in the same situation. Note that at the time of this writing, support for skeletal animation is still considered experimental, and from what I can tell from the GitHub wiki has only become relatively stable in the last few months.
These are some pitfalls / things to check if you are struggling to get things working. Note that they mostly don’t apply to exporting Morph Target animations.
Ensure the scale is appropriate.
This is more of a general tip for using the exporter. With default settings your model may be much too small for the scene. The ThreeJS examples use a scale that is much greater than that of the Blender default scene. I set the export scale to 50 for the model to look right in my ThreeJS scene.
Delete the Armature Modifier before exporting.
It appears that you have to delete the Armature Modifier from your mesh object before using the exporter, or the animations come out distorted. The bones will still be included in the export data, but if the armature modifier is on, the animation will be broken. This does not seem to be widely documented, and I only came across it in a discussion on the GitHub wiki.
Check your Vertex Groups.
When using skinning, ThreeJS will not render parts of the mesh that have not been assigned to any bones. This can really trip you up when you are getting started. The fact that you have to delete the armature modifier may make you think you don’t have to assign it in the first place, but this is incorrect and will potentially cause the model to appear invisible or incomplete. You must have correctly assigned vertex groups on your mesh. The easiest way to achieve this is to use automatic weight generation when you assign the armature modifier. Once assigned, you can delete it again immediately and the vertex groups will remain. Examine the vertex groups in the Object Data panel, or go into weight paint mode and make sure the mesh has been assigned to the bones. Also be careful to delete all the vertex groups that you don’t need or you may get errors from ThreeJS. ( Basically you want meaningful data only in your export, and nothing else. )
Key all bones in the first and last frames of your animation.
I found that I had to insert a keyframe for every bone of the armature in the first and last frames of the animation, to describe the initial pose and the final pose. Even if the animation looked fine in Blender, without these frames the mesh would rotate or twist strangely in ThreeJS, or would be missing the last part of the animation. Create the keyframe in Pose Mode by pressing ‘a’ to select all the bones, then press ‘i’ to insert a keyframe for ‘LocRot’ ( ie location and rotation ).
I don’t yet understand why, but it seems that you have to key the last frame of the animation as well, or animations will break in the middle.
Have something to add to the list? Have I made any mistakes? Is anything out of date? Please let me know in the comments…
I’ve recently been taking a look at WebGL libraries, and I feel that Three.js is the most promising library available at this time. I also have huge respect for Mr Doob as a developer-of-cool-things, and would like to support and get involved with this project. It appears to be evolving rapidly, and many of the more cutting edge features have only been added within the last few months.
The API is very simple to understand and use, and it only takes a few lines to set up a 3D scene. There are the usual assortment of primitives, and several importers for different file formats. Of most interest to me is the Blender exporter tool, which provides a simple pipeline for getting models into a Three.js scene. Support for animation is one area that is still in development, and considered “experimental”. Morph targets worked first time for me, but it took a bit of finessing in Blender to get the skeletal animations to work correctly. I expect that in time this area will become very stable as it is used by more people.
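As an illustration of how few lines a basic scene takes, here is a sketch along the lines of the early Three.js examples. Constructor names have shifted between revisions ( CubeGeometry, for instance, was later renamed ), so treat these as representative rather than exact; the THREE namespace is passed in and createCubeScene is my own name, just to keep the sketch self-contained.

```javascript
// Sketch of a minimal Three.js scene: one red cube, a perspective camera,
// and a WebGL renderer. Names follow the early Three.js examples and may
// differ in later revisions.
function createCubeScene(THREE, width, height) {
  var scene = new THREE.Scene();
  var camera = new THREE.PerspectiveCamera(70, width / height, 1, 1000);
  camera.position.z = 400;

  var geometry = new THREE.CubeGeometry(200, 200, 200);
  var material = new THREE.MeshBasicMaterial({ color: 0xff0000 });
  var cube = new THREE.Mesh(geometry, material);
  scene.add(cube);

  var renderer = new THREE.WebGLRenderer();
  renderer.setSize(width, height);
  return { scene: scene, camera: camera, cube: cube, renderer: renderer };
}
```

Append renderer.domElement to the page and call renderer.render( scene, camera ) in an animation loop, and you have a spinning cube in a handful of lines.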
The library also has a variety of post-processing effects that can be used on a scene with relative ease, and it supports custom shaders, so there is a lot of flexibility there.
This is an important time for developers who work with browser-based technology. WebGL and WebAudio are the most exciting to me personally, as a game developer. Three.js is a great starting point for experimenting with WebGL, especially if you are more interested in really getting something done than just “checking out the tech.” One of the reasons I admire Mr Doob’s work is that he has always used the technology to create wonderful and innovative experiments and experiences. Now he is helping to enable others to do the same… Thanks Mr Doob, you are awesome!
Last month, just before the Easter holiday, Ninja Kiwi released its latest game: Battle Panic.
I was coding this project for about five months, since the previous October. I had the excellent experience of being provided with quite a lot of completed art and character animation before I even started, by the amazingly talented / jealousy-inducing artist and animator Warwick Urquhart, so it was looking good right from the beginning. No programmer art in sight. It was awesome working with Warwick on this one. THANKS WORIC!!!
I got to do some of my favourite kinds of development: making little autonomous guys run around ‘interacting’ ( fighting ) with each other.
In the next version the castRay() function will return an object containing the edge normal and ID of the hit tile. Later, I’ll add handling for fine collisions with tiles having “interesting” contours.
While in this example I’m using the ray to find a collision with a stationary obstruction ( a solid tile ), the same algorithm can be used as a broad phase for things like projectile collisions on moving targets. It would work nicely as an extension for Grant Skinner’s grid-based ProximityManager class.
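For concreteness, here is an illustrative version of the kind of grid walk involved. This is my own sketch of a standard DDA traversal, not the actual castRay() code; castGridRay, the unit tile size, and the solid() predicate are all assumptions made for the example.

```javascript
// Walk the grid cells crossed by the segment (x0, y0) -> (x1, y1) and
// return the first solid cell hit, or null. Tiles are 1 unit square;
// solid(cx, cy) is a caller-supplied predicate.
function castGridRay(x0, y0, x1, y1, solid) {
  var cx = Math.floor(x0), cy = Math.floor(y0);
  var dx = x1 - x0, dy = y1 - y0;
  var stepX = dx > 0 ? 1 : -1, stepY = dy > 0 ? 1 : -1;
  // Parametric distance along the ray to the next vertical / horizontal
  // grid line, and the distance between successive crossings.
  var tMaxX = dx !== 0 ? (dx > 0 ? cx + 1 - x0 : x0 - cx) / Math.abs(dx) : Infinity;
  var tMaxY = dy !== 0 ? (dy > 0 ? cy + 1 - y0 : y0 - cy) / Math.abs(dy) : Infinity;
  var tDeltaX = dx !== 0 ? 1 / Math.abs(dx) : Infinity;
  var tDeltaY = dy !== 0 ? 1 / Math.abs(dy) : Infinity;
  var t = 0;
  while (t <= 1) {                       // stop once we pass the segment end
    if (solid(cx, cy)) return { x: cx, y: cy };
    if (tMaxX < tMaxY) { t = tMaxX; tMaxX += tDeltaX; cx += stepX; }
    else               { t = tMaxY; tMaxY += tDeltaY; cy += stepY; }
  }
  return null;                           // no solid tile along the segment
}
```

The walk visits only the cells the ray actually crosses, which is what makes it cheap enough to serve as a broad phase before any fine collision test.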
I’m certain there are optimisations I could make at this point. All feedback is welcome…
As a long-time OSX user, I managed to miss out on the joy of FlashDevelop until earlier this year when I started at my new job. When I went back to Mac, I really missed all of those convenient features. As much as I love the universal power of TextMate as an editor, I was completely converted to the pleasures of auto-complete, auto-import, etc.
I struggled for a long time to find a decent alternative for OSX but there really was nothing I liked. Eclipse had some badly supported plugins, and the alternatives were the expensive Flash Builder from Adobe, and FDT. ( None of which were at all close to the clean usability of FlashDevelop, let alone the zen-like TextMate. )
Then, I happened to read about the FlashDevelop Bridge project. This has totally solved my AS3 development needs on OSX, with near perfect integration into the operating system.
The concept behind the Bridge is that you virtualise Windows and run FlashDevelop there. The Bridge runs as a server on the host ( OSX or Linux ) and talks to FlashDevelop over the divide. This communication allows FlashDevelop to signal the host OS to build using either the Flash IDE or Flex ( ie, on the host operating system ). This means the virtualised Windows doesn’t have to do any of the heavy lifting, and therefore won’t slow you down.
1 – VirtualBox: Devices > Shared Folders, share your dev folder ( eg /Users/yourname/Dev ) as ‘Dev’
2 – MS Explorer: Tools > Map Network Drive, map Z: to \\VBOXSRV\Dev
3 – Mac Bridge: configure Z: as local /Users/yourname/Dev
4 – FlashDevelop: Program Settings > BridgeSettings, verify drive and set ‘Active’
5 – restart FlashDevelop
Note that you can call the shared folder whatever you like. Don’t forget step 2. This is easy to neglect because after you share the folder through VirtualBox, it will show up in explorer, but you still have to map the network drive for Bridge to work.
As an optional but highly recommended final step, install KeyTweak on Windows, and remap the left command key ( Windows will see it as the left Windows key ) to be a control key. If you don’t do this it will drive you insane switching between ctrl-c and cmd-c keyboard shortcuts all the time. If you choose to do this step you’ll also have to open the VirtualBox preferences and, on the “Input” tab, change the ‘Host Key’ to be the right command key rather than the left, since we are now using that for control. This is the key you press to release keyboard focus, and it is basically the only taint on otherwise perfect OS integration. Before you can cmd-tab to another application you have to either press the host key to release focus, or click out of VirtualBox. It is a small price to pay for FlashDevelop on OSX, and you get used to it after a short time.
Hungry Sumo for iOS is now in the App Store. This is the first full-fledged iOS title I’ve developed. I’m very happy with how it came out. It was exciting to see it appear in the iTunes store. This version has 100 levels, and 4 mini-games. Ninja Kiwi FTW!
I was recently asked to give my opinion on Booktrack, a new “technology” that adds a synchronised audio soundtrack to e-books read on your mobile device. The company describes the technology as “the largest enhancement of reading.” So far it seems that the trend is to apply a severe lashing to the whole idea.
I approached it with an open mind, but in all honesty I can say that this amazing new technology is a step backward for literature, and serves only to hobble the reading experience, rather than enhance it.
The first thing you notice as you begin to read is a little arrow that moves slowly down the right hand side of the page. Presumably this is set to an estimate of the average adult reading speed. From the start, you feel that you are either racing the little arrow, or stopping and waiting for it to catch up. As you turn the pages the arrow adapts to your reading speed, so if you read at an even pace it does a decent enough job of being in roughly the right place at the right time. But because it is never totally accurate, the sound effects in the audio track can never be too specific, lest they be completely ill-fitting at a given moment. Punctuations timidly punctuate, surrounded by long periods of pattering rain or crackling flame.
So, here’s the real problem: Reading is not supposed to be a passive activity. It is imaginatively active. The author works with the tool of language to encode the narrative, and the reader then works to decode it. A magical visionary phenomenon occurs – the reader’s external senses are plugged shut and the sense-impressions encoded in the writing take over. That is immersion. That is how the world “melts away”. The reason Booktrack is so universally criticised is that an audio soundtrack to accompany the magical act of reading is just redundant, distracting, or if you want to get worked up about it – a disgusting perversion of the pure art of literature.
I particularly enjoy the example of Moby Dick in the promotional video, “enhanced” with the sound of waves and whalesong. This is necessary, of course, because Herman Melville was not capable of evoking a vivid image in the reader’s mind, because he did not spend twenty or so pages describing every impression in the most detailed language possible, because Moby Dick is not “totally immersive” enough already.
Another criticism I have of Booktrack is that it will only ever be suitable for action-based narratives. Anything thoughtful, cerebral, or philosophical simply will not work. Therefore any body of literature based around this technology will necessarily be limited in range and scope.
I don’t think it is just the gut response of a prose purist to say that Booktrack can only serve to detract from what greater minds have made. The soundtrack can only distract from the impressions the author has worked so hard to evoke. For a product with so many big names behind it, it is a real disaster.