(I fielded this question recently and since it seems like a common one, I decided to post the gist of my answer for future reference. It's also written from the position of a web developer, but mostly applies to other sorts of development too - although game developers obviously have special platform requirements.)
Let's start with the counter-arguments first - why don't developers want Windows or Linux machines?
Lots of developers have Windows machines at home, certainly - usually for games or because someone else in their house is used to that platform.
But Windows as a development platform is a mess. The familiar suite of POSIX tools available on other platforms isn't really there - or worse, doesn't really make sense - because Windows' architecture is an awkward, monolithic blob. The GUI is also getting steadily more hostile to power users, with the mildly-obnoxious Windows XP/7 interface being steadily buried under a pile of 'Metro' eye-candy in more recent versions.
Finally, provisioning hardware for Windows is a hassle. While compatibility is rarely a problem, there's a bewildering array of providers, and there's steady downward pressure from businesses on price, making it hard to invest in a serious productivity machine when more 'affordable' systems are available.
So if developers want better configurability and a geek-friendly interface, why don't they use Linux? It's what they're usually developing for, right? Linux-based mobile platforms, embedded Linux devices, Linux-fuelled web servers and virtualised environments - it just makes sense to be running the same platform on your development machine.
So close, and yet so far. While Linux on the desktop has come along in leaps and bounds, it still often requires a lot of tweaking to deliver a solid, fully-featured environment, and there are regular issues stemming from the rapidly changing ecosystem that is desktop Linux.
Linux users face the same hardware provisioning issues as Windows users, multiplied by compatibility problems - and while the performance requirements for running a version of Windows are relatively well known, the level of hardware that would support a serious development machine under Linux is usually a lot less obvious.
So what does Apple get right with their ecosystem of products that makes them so desirable? In many respects it's the fact that they offer fewer options, not more, that keeps geeks coming back - as counter-intuitive as that may seem.
Hardware is the most obvious place to start (Apple is, after all, a hardware company.) Apple laptops are solid, stylish, and extremely well designed - they 'just work', and are very comfortable to use. And whatever you think of Apple's business practices and manufacturing ecosystem, they are certainly able to offer high performance machines in very portable form factors, at relatively affordable prices.
And because they only support their operating system on their own (very constrained) range of hardware, it's relatively stable and integrated - features like power-saving and sleep/restore that (in my experience) "mostly work" on Windows and "sometimes work" on Linux-based systems are very reliable on Macs.
That constrained product range makes it easier to justify to an accountant too. "The latest 15-inch MacBook Pro" is very easy to describe and order, whereas the equivalent hardware from the PC ecosystem would require a lot more explanation, and possibly justification.
OSX also reflects this 'less is more' aesthetic - while opinionated, it at least gets out of the way for most tasks, and its steadily growing suite of bells and whistles is predominantly opt-in.
It's not a perfect platform, of course - glossy screens, Thunderbolt over USB and an endless parade of display connectors are the price we pay for the nice hardware. And on the operating system side, OSX isn't immune to changes that break the way developers try to integrate it (XQuartz is a good example) and it's missing basic developer tools like package management, which requires hacks like Homebrew or MacPorts. But with tools like that in place, and all the other advantages I've discussed, even fairly anti-Apple developers like myself have been forced to acknowledge the platform as a compelling one to develop on.
In conclusion, we want the system that allows us to be most productive, and Macs generally represent the best set of compromises in this respect. So while they may seem expensive or hipster, ask yourself - what's more expensive in the long run, your developer's time, or the machine they spend it on?
As is so often the case, my attempts to sleep one evening were stymied by an image of something I'd like to build - in this case a toy lantern, laser-cut from plastic and plywood.
Well, a few days later I sat down for an hour or two and sketched out the pieces you can see in the first image, spread them out into Ponoko's template files, and sent the design off - and today the bits arrived.
As usual the results are lovely - the Ponoko folks added a few extra layers of protective film to hold the small pieces together during transit, and the consistency across all the cutting is really nice. As you can see, I immediately spread the lot out across my desk at work - I had to find out if my assumptions about the way the model would snap together actually worked.
All in all the design is pretty solid - I continue to fiddle with tolerances on some stuff (in this case the laser took less material than I expected) but I think with a little tweaking I'll have a design that's not only cute, but might actually be popular.
So the first thing I noticed after making my previous post was that I didn't have any of the necessary resistors in place, which was disappointing. But then I realised that Fritzing have a very enlightened approach to order modifications (I imagine lots of designers have last-minute ideas after they've submitted their boards), so I was able to fix the design they'd already accepted.
Then, waiting - always the most frustrating part of using any of these awesome services. (As I write this I just took delivery of a load of laser-cut stuff from Ponoko - that wait was also agonising, but more about that later.)
But finally it arrived! As a quick check I used the header block to stake the board to my Arduino - looked like I hadn't messed up too badly, so time to take the big step - soldering. (Anyone who follows me on Google+ will recognise the image to the right from there.)
Working one LED at a time, I tweaked my Arduino code to verify each component as I soldered it into place and, miraculously, everything worked!
And here's the proof - click the animated gif for a video of the finished product in full swing.
I'm pretty excited - I just sent my first circuit board layout off to Fritzing Fab! (I mean, I've probably messed up the header holes or the through connections or something - it is my very first attempt - but I'm still psyched.)
I've got a pack of headers and LEDs coming from Little Bird Electronics to drop into the board when everything arrives, then it's just the fun parts - soldering and programming. :)
If you have any interest in Arduino you can probably tell from the picture that it's designed as an extension shield - if all goes according to plan it will drop into place on top of my existing Arduino Uno.
I noticed recently that Mozilla are running a competition to create JS games with Goo Technologies' new toolkit, so I thought I'd try it out. (It has a lot in common with Unity3D, so I thought I'd be able to adapt a few ideas across and have *something* in a single, time-bounded session.)
It was great practice for my JS skills, and I learned a lot about serious web apps - but mainly I learned the difference between a good JS library and a *great* JS library.
My experience with Unity has always been - if I need something, there's documentation for it, and MonoDevelop can autocomplete for me in C#-land, giving me a super easy way to explore the API. The Goo Engine, on the other hand, only seems to be available as a minified JS file, and the documentation is sparse to say the least, making discovery incredibly difficult. This was thrown into strong contrast when I decided to fold in a physics library, and chose CannonJS (which is actually bundled with Goo): Cannon has significantly more readable documentation, and is also available unminified, so I could just jump into the code to unpick some misunderstanding or bug. (This is a habit I've picked up from Ruby on Rails - learning and debugging by just digging into the library code directly.)
Goo, in their favour, provide some recipes for various common tasks
- but these are inconsistent, and if your problem isn't covered, you're out of luck. It all points to a library that has grown incredibly fast - they have an impressive amount of functionality (a substantial chunk of the core Unity stuff, I suspect) but the rest of the project is struggling to keep up with the rapidly changing code. I'll certainly be following their progress, and look forward to playing with the Goo Engine again when it's a little more mature.
I've had the idea for an animated cube character for a long time, and recently I finally built a basic version in Blender.
Since then I've been experimenting with a few different things in Unity3D, resulting in this Cubiques Prototype.
It's my first time using Unity's particle effects and animation state tools - I'm hoping to add more animation behaviour, and ultimately expand this into a game.
My most recent narration has landed on Protecting Project Pulp #59
- 'The Opener of the Way' by Robert Bloch is, "A tremendous tale about the dread doom that overtook an archeologist in that forgotten tomb beneath the desert sands of Egypt". Enjoy. :)
Chronic and Chronic Duration are super useful gems for parsing natural language descriptions of dates and durations into Ruby objects (DateTimes and Fixnums, respectively). They're both popular with Rails developers, but when new developers ask how to integrate them into their app, the answer is usually 'roll your own'.
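That hand-rolled approach usually looks something like the sketch below: a virtual attribute whose writer runs the input through a parser before assigning the real column. To keep the sketch runnable outside Rails, Chronic is stubbed here with the standard library's Time.parse - the real gem handles much looser phrases like 'next tuesday at 3pm', and the Meeting class stands in for an ActiveRecord model.

```ruby
require 'time'

# Stand-in for the chronic gem so this sketch runs standalone;
# the real Chronic.parse accepts natural language, not just timestamps.
module Chronic
  def self.parse(text)
    Time.parse(text.to_s)
  rescue ArgumentError
    nil # unparseable input yields nil, as the real gem does
  end
end

class Meeting
  attr_reader :start

  # Virtual attribute: parse the natural language input,
  # then assign the result to the real attribute.
  def chronic_start=(value)
    @start = Chronic.parse(value)
  end
end

meeting = Meeting.new
meeting.chronic_start = '2013-05-01 15:00'
meeting.start # => a Time for 2013-05-01 15:00
```

In a real app you'd `require 'chronic'` instead of the stub and hang the writer off your model - which is exactly the boilerplate that gets tedious to repeat.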
After adding support by hand in this manner a few times, I knew it was time to break my work out into something I could re-use. Thus, Chronorails. Let's start with an example.
For a hypothetical Rails model:
class RomanticMeeting < ActiveRecord::Base
  attr_accessible :length, :start # Integer and DateTime DB fields, respectively

  chronic_field :start, :required => true
  chronic_duration_field :length # duration counterpart, implied by the form example below
end
…include the accessors module, and configure Chronorails to wrap your attributes with either Chronic or Chronic Duration virtual attributes.
Then in your form:
<%= f.text_field :chronic_start %>
<%= f.text_field :chronic_duration_length %>
…you can use the virtual attributes for your fields, entering natural language date and duration information that will be parsed into the regular fields (or will generate validation errors).
The ‘required’ option prevents setting the attributes with blank values; the ‘validates’ option controls the generation of validators (defaults to true), and the ‘accessible’ option controls the generation of Rails 3 ‘attr_accessible’ calls (also defaults to true).
I've workshopped this concept and code with a few developers; now I'm hungry for feedback from a wider field. Let me know if it suits your needs and, ideally, whether the code inside makes sense. Fork it on github and have a play. :)
Brand New Media's web department works primarily in Ruby on Rails, and Rails apps (like all apps) always have a certain amount of configuration. Host names, API keys - there's always some settings you want easy, central access to, or need to vary for different environments.
Rails' default behaviour is to use a central code file (application.rb) and a series of per-environment code files (development.rb, production.rb) for configuration, and that will serve the needs of a lot of Rails apps. Once you outgrow that, however, there are a lot of options - I personally like railsjedi's rails_config, which employs a structure of YAML files to provide a clear and expressive way to capture large amounts of configuration data. One feature I particularly like is its concept of 'local' configuration files - files that capture install-specific (particularly, developer-specific) configuration, keeping it out of the code repository where it might clash with other developers' set-ups.
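To make that concrete, here's the kind of file layout rails_config encourages. The file names follow its documented lookup order (later files override earlier ones, and the .local variants are kept out of version control); the settings values themselves are invented for illustration:

```yaml
# config/settings.yml - shared defaults, committed to the repository
api_host: api.example.com

# config/settings/development.yml - per-environment overrides
api_host: localhost:3000

# config/settings.local.yml - developer-specific, git-ignored
api_host: my-test-box.local
```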
The next level beyond this, though, is instancing - deploying the same application multiple times, with configuration tailored to each instance. At this point another of rails_config's features comes to the fore - customisable configuration paths. The pattern we decided to pursue was to set an environment variable for each instance, and use that value in the configuration stack to find and load an instance-specific configuration file. This needs to happen after rails_config has been initialised, though, so a naive approach would be to add this logic into our application.rb, in an after_initialize block.
But this quickly leads to a chicken-and-egg problem - in the configuration stack we're both setting up rails_config and attempting to use information from it (for example, configuring the excellent Devise authentication gem for Facebook authentication requires the credentials of our instance-specific Facebook application). We obviously need a more fine-grained approach to our initialisation process - one that will allow us to knit our instance configuration into the initialisation process at the right point (most naturally, right after rails_config initialises itself).
Luckily, Rails gives us this fine-grained access in the form of initializers. So here's an example of an initializer that solves the problem I've described.
class Application < Rails::Application
  # Normal Rails app configuration

  initializer :add_instance_config_to_rails_config,
              :after => :load_rails_config_settings,
              :before => :load_environment_config,
              :group => :all do
    # Layer the instance-specific file on top of the settings rails_config
    # has already loaded, then rebuild the Settings object. (The file path
    # and APP_INSTANCE variable name are illustrative.)
    Settings.add_source!(Rails.root.join('config', 'settings', "#{ENV['APP_INSTANCE']}.yml").to_s)
    Settings.reload!
  end
end
Initializers allow us to pick a place in the initialisation stack and insert ourselves there, based on the named steps described here. In this case, we're creating a new step (add_instance_config_to_rails_config) and inserting it before the default Rails environment load event, but after rails_config's own internal custom step.