Thoughts from SoCal VR Conf

Last Sunday, Jeremy and I attended the SoCal VR Conference at UC Irvine.  I was excited to try out some real virtual reality equipment instead of just reading about it.  To be fair, I do have a Google Cardboard, but it’s just not the same thing–the sweet spot (area of the lenses that are in focus) on Cardboard is way too small to give any real immersion.

The first demo, created by Liv Erickson, used some early WebVR APIs to generate a VR scene in a browser which rendered to an Oculus Rift DK2 without positional tracking.  The frame rate was a noticeable distraction for me, but it was cool to see what the future of the web may hold.  I don’t really understand the use case though.  Would users sit down to their computer to browse for products on Amazon and then strap on their VR headset to look at products in 3D?  Will people build web games in VR?  Time will tell.  If you’re already a web developer, this looks like a great way to experiment with VR.  Check out Liv’s blog or talk to get started.

The next demo we saw was a Gear VR game called Dandelion.  Here’s a video that shows what it looks like.  I found the low poly art style to work well.  We talked to David, the programmer for the game, and he talked about what his team had built and what they’d like to do with it going forward.  The game was a fun little experience where you are a tiny creature exploring a kid’s backyard.  For movement, it used Samsung’s (xbox-like) gamepad controller.  I only played for a few minutes, but was starting to get a little sim sick by the end.  The nausea wasn’t very strong but it stuck with me for a good 30-45 minutes.  First person artificial (not from moving your body around) movement, especially rotation, is going to be a big challenge for VR.  I really like the idea of a slow-paced exploration game (maybe with some physics or logic puzzles thrown in), but that locomotion problem really needs to get figured out.  Maybe Valve‘s room-scale VR is the only way to go… If so, I hope the kids don’t mind sharing a room because daddy needs to build a holodeck. 😉

We took a break to get some lunch and let my stomach settle before heading back in for another loop through the expo.  This time, we were lucky to stop by Studio Transcendent‘s booth where they were showing off a beautiful history of flight air show on a DK2 with position tracking.  Jer got to use the nicer station with a top-of-the-line video card and sub-woofer based floor vibration platform.  Still, without that extra gear the demo ran very smoothly for me.  I only noticed one point where it seemed to stutter.  The demo consisted of planes from the Wright Flyer to modern jets taking off and flying over your head while you stand in the middle of a large airport runway at dusk (or dawn?).  There was supposed to be some voice over narrating the experience but the volume was too low for me to hear it.

During this demo, I spent some time really noticing the pixel layout (you can totally see the pixels in the DK2).  For thin lines, like the painted lines on the runway when viewed nearly edge on, you could get something like a Moiré pattern where the line could “hide” between the pixels and become extra aliased.  This was especially bad for the thin lines that ran nearly horizontal through the field of view (I’m sure near vertical lines would have the same problem).  Rotating my head 45 degrees to either side would smooth everything out.  I wonder if the headset makers should account for this problem by creating displays where the pixel layout is rotated 45 degrees.  (In games, as in the real world, many straight lines are nearly vertical [like the edge of a building] or horizontal [like the edge of a counter top].)  Probably, they’ll just push for higher resolution and optics that subtly blur the pixels to solve the problem that way.  We chatted with Ian and John about flight sim games and next-gen (what will be consumer) virtual reality gear.  It sounds like it’s going to be amazing.

At the end of the day, I had a great time and it really sparked my interest (as if it wasn’t already) in wanting to get involved in VR in one form or another.  I wouldn’t recommend the DK2 for consumers (and probably not for myself), but I think the first generation consumer hardware is going to be great (and just imagine what the 5th generation hardware will be like!).  I hadn’t given much weight to the sim sickness problems before, but now I see them as being so important that I’m sure they’ll shape what genres emerge on this new medium (sorry FPS fans).  I would have liked to be able to try out the HTC Vive or Oculus’s CV1 prototype, but sadly that wasn’t in the cards.  Social experiences are also going to be interesting, but AltSpaceVR‘s booth was busy so we didn’t try that out.

Migrating from SVN to Git

I’m in the process of cleaning out my old PC and moving the content over to my iMac.  As part of the process, I’m converting an old Subversion repository that held a bunch of my side projects into several smaller Git repos.  Here’s the general flow for each of my projects (Warning: these notes are really just for me.  I’m still learning git and am probably doing this wrong.):

git svn clone svn://localhost/ --username=aaron --preserve-empty-dirs --authors-file=authors --no-metadata --include-paths="^path/to/project1" project1

cd project1
git filter-branch --prune-empty --subdirectory-filter "path/to/project1" HEAD

I repeated that a few times; then, to pick up the miscellaneous scraps, I did:

git svn clone svn://localhost/ --username=aaron --preserve-empty-dirs --authors-file=authors --no-metadata --ignore-paths="^path/to/(project1|project2|project3)" misc_svn

cd misc_svn

git filter-branch --prune-empty

I don’t know if that last step was required.

Once I had working git repos on the PC, I shared the directory that contained them all, and connected to that folder from my Mac.  Once connected, I could easily clone them with:

git clone file:////Volumes/git/project1/

So far, it looks like it worked well.  Each project has just the parts of the log that it needs, but it still retains the history.
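As a sanity check, the subdirectory-filter step can be rehearsed on a throwaway repo to confirm that the project files get hoisted to the root and the commit history survives.  This is just a sketch using the same hypothetical `path/to/project1` layout as above, not part of the actual migration:

```shell
#!/bin/sh
# Sketch: reproduce the subdirectory-filter step on a scratch repo.
set -e
export FILTER_BRANCH_SQUELCH_WARNING=1   # silence filter-branch's deprecation pause
tmp=$(mktemp -d)
git init -q "$tmp/demo"
cd "$tmp/demo"
git config user.email aaron@example.com
git config user.name aaron

# Simulate a project living in a subdirectory, like the svn layout.
mkdir -p path/to/project1
echo "hello" > path/to/project1/file.txt
git add .
git commit -qm "import from svn"

# Same filter used during the migration:
git filter-branch -f --prune-empty --subdirectory-filter "path/to/project1" HEAD >/dev/null 2>&1

# file.txt should now live at the repo root, with its commit intact.
test -f file.txt
git log --oneline
```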

Big Five 2014

I’ve taken personality tests in the past (yep, still INTJ), so I thought I should try out the Big Five to see where I land, and so that I can check it again in the future to see how it changes.

From the first test, I ranked (1-5 scale): Openness – 3.9, Conscientiousness – 3.2, Extraversion – 1.9, Agreeableness – 3.8, Neuroticism – 3.1.

Another site has a very long/detailed “Big Six” test.  I took all sections of the test (even though you could quit out early) and it showed me as middle of the range for nearly everything: Extraversion – 5/9, 43rd%; Openness – 5/9, 50th%; Agreeableness – 5/9, 48th%; Integrity – 5/9, 54th%; Emotional Stability – 5/9, 47th%; Conscientiousness – 5/9, 49th%.  I wonder if it’s so middle-of-the-road because I tended to respond to the prompts with “slightly accurate/inaccurate”.  My report details are here.  Hmm… I tried retaking it quickly, avoiding the “slightly” answers, and it gave the exact same report.  Grrr.  Maybe I’ll try the short version of it from a different computer.

Update: I retook the sapa-project Big Six test the next day.  This time I completely avoided the “slightly” answers and just did the core 100 questions.  The detailed results are here. In summary: Extraversion 4/9, 33rd%, Openness 5/9, 50th%, Agreeableness 5/9, 53rd%, Integrity 5/9, 59th%, Emotional Stability 5/9, 42nd%, Conscientiousness 5/9, 44th%.  Compared with yesterday’s test, there really isn’t as much of a swing as I expected with the new strategy.

From OutOfService (0.0-1.0 scale, percentile): Openness – 0.8, 70th%; Conscientiousness – 0.556, 41st%; Extraversion – 0.188, 4th%; Agreeableness – 0.556, 27th%; Neuroticism – 0.438, 32nd%.

From another test (1-5 scale, percentile): Extraversion – 1.8, 9th%; Conscientiousness – 4, 76th%; Neuroticism – 2.6, 32nd%; Agreeableness – 3.4, 22nd%; Openness – 4.1, 47th%.

From SimilarMinds: Extroversion – 16%, Orderliness 64%, Emotional Stability 48%, Accommodation 42%, Inquisitiveness 82%.

From the last test: Low on Extraversion, Moderately High on Agreeableness, Moderately High on Conscientiousness, Average on Emotionality, Moderately High on Intellect/Openness.

Using Wikipedia’s terms, it looks like the overall is something like:

  • Openness – Medium/High
  • Conscientiousness – Medium/High
  • Extraversion – Low
  • Agreeableness – Medium/Low
  • Neuroticism – Medium

I’d be interested to hear what people who know me think, and how you get ranked.  Is this useful for anything?

More notes from dotScale 2013

Instead of posting individual posts per video, here are some rough notes from a bunch more of the dotScale 2013 videos.  This is not a complete list.  Any mistakes are my own.  Watch the videos if you want a direct source.

dotScale 2013 – Jonathan Weiss – DevOps at scale
Works for OpsWorks
First Rule: Things will break, plan for it.
Divide and Conquer, decouple
“Limit the blast radius” if a component goes down or becomes slow, other components should continue working
Deploy new version to 1 of many many machines, measure (latency, CPU, Memory, key performance, errors, etc…) Then, if it’s good, roll out to more. Requires supporting multiple versions concurrently. Staged rollout. Automate this.
Have good backup/restore and disaster recovery strategy. Practice it frequently.
Chaos Monkey – Introduce failure daily so that you can be sure to handle it automatically.
Measure everything that you can.
Lots of testing and auto-reconfigure based on goals.
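The staged-rollout idea from the notes above can be sketched roughly in shell.  The host names and the deploy/health-check commands here are hypothetical stand-ins for real tooling, not anything from the talk:

```shell
#!/bin/sh
# Rough sketch of a staged ("canary") rollout: deploy to one machine,
# measure, and only then fan out to the rest of the fleet.
hosts="web1 web2 web3 web4"
canary="web1"

deploy()  { echo "deploying v2 to $1"; }   # stand-in for the real deploy step
healthy() { return 0; }                    # stand-in for latency/CPU/error checks

deploy "$canary"
if healthy "$canary"; then
  # Canary looks good: roll out to the remaining machines.
  for h in $hosts; do
    [ "$h" = "$canary" ] && continue
    deploy "$h"
  done
else
  echo "canary failed; rolling back $canary" >&2
fi
```

Note that this implies supporting two versions running concurrently while the rollout is in flight, which is the part the talk calls out as the real requirement.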

dotScale 2013 – Nicolas Fonrose – Welcome to your new job
Architects know to worry about: reliability, latency, speed
Now we need to worry about cost.
Everything we do in the cloud has a cost.
Cost Driven Design
Lots of opportunity to save money by using cloud computing/storage

  • Script everything, never work manually, don’t use graphical interfaces, use the API
  • Measure everything: not just performance, but costs
  • Continuous Management of cost: like build automation, but cost improvement
  • Correlate between all actions and cost to get a good big picture view

dotScale 2013 – Thomas Stocking – Virtual Network over TRILL public cloud provider
Need layer-2 network isolation for large scale multi-tenancy
VNT = TRILL + VNI (TRILL = Transparent Interconnection of Lots of Links; VNI = Virtual Network Identifier)
Planned to open source it in the next year

dotScale 2013 – Stanislas Polu – Uses of tmux explained
Terminal multiplexer. Lets you switch between multiple applications in one terminal.

dotScale 2013 – Quentin Adam – Scaling, what you need to know
Founder of Clever Cloud
Scale up or Scale out (hint: out is better)
Scale out = many workers doing the same thing, avoid single point of failure, easier to grow
Differentiate between process and storage
Storage: Database, files, Sessions, *Events*, user accounts, user data
Process: Can be replicated, stateless, process (takes data, transforms data, produces data)
Statelessness is key
Choose data store wisely (probably choose multiple data stores for different parts of the system)
Example questions when choosing a data store:

  • Do I need atomicity of requests?
  • Do I need concurrent access (read? write?)?
  • Do I mostly read or write?
  • Do I need relational?
  • Do I need big storage capacity?
  • Do I need high availability?
  • How long do I need the data?

Use an online (Internet based) database to test ideas before messing up your computer with installing software.
Don’t start with technologies (Node.js + Mongo) and then ask yourself what problem/project you’ll solve/build. Start with a problem, then find the right technologies.
Balance learning curve with time saved.
Don’t make monsters (technology twisted to do something it wasn’t designed for). For example, a job queue built on MySQL and Cron.
Common mistakes to avoid:

  • Don’t use RAM as data store (avoid shared/global variables; can’t scale, will cause error).
    • RAM should be used to take a bit of data, process it, then dump it.
    • Processes should return the same output for a given input.
    • If you store in memory, code will fail, and data will be lost.
  • Don’t use the file system.
    • Store in a database or something like Amazon Simple Storage, memcached, couchbase.
    • File systems don’t scale, they’re a bottleneck, they create coupling to the OS/host provider/language, and they’re a single point of failure.
  • Be very careful with dark magic
    • For example, read how frameworks work (eg Scala Lift is stateful)
  • Split your code into modules
    • Keep the code per module small (makes it easier to find the bugs)
    • Modules should act as services to each other
    • Choose the right technology per module
  • Use event broker to modularize the app
  • Make hard computation async
    • Always use a reverse proxy
  • Use process deployment
  • When things fail (and they will)
    • Keep calm, get metrics, find the bug, fix the bug

Evolution of service deployment links

Here’s another video from dotScale:

Dave Neary – Evolution of service deployment

Dave includes a slide at the end with a bunch of logos.  Here they are as links:

Software for Scale

Here’s an interesting talk that I just watched:

dotScale 2013 – Brad Fitzpatrick – Software I’m excited about

And some links to go with it:

  1. Fast, cheap, reliable VMs
  2. Good machine + cluster management system
  3. Good programming language
  4. Good lock server
  5. Bonus – Money
  6. Brad’s Side project

Recipe: English Toffee

Read all of the directions and plan it out before you start.  You should do some steps in parallel.

Cooking time: 35-45 min.

What you’ll need:

  • 1 cup water
  • 2.5 cups sugar
  • 1 lb butter
  • Whole almonds (Optional)
  • 3 bars Hershey’s milk chocolate
  • Long handled wooden spoon
  • Deep pot
  • Candy thermometer
  • Cookie sheet
  • Microwave safe glass bowl
  • Rubber spatula

Bring sugar and water to a slow boil, cover, and boil on medium heat for 4 min. (No peeking!)
Create cooling bath that you can set the cookie sheet in.
Apply thin coat of butter to the cookie sheet.

Carefully add butter to syrup one stick at a time and stir. Wait until each stick is fully melted before adding the next.
Add a large handful of whole almonds to mixture.

Stir on medium heat until 310 deg. F. (Hard Crack).
Turn off heat.
Wipe off the thermometer on edge of pan to clean.
Pour onto cookie sheet.
Place cookie sheet in cold bath.

Wait 20 min, then apply chocolate.
To melt chocolate:
Place 3 bars in glass bowl
Microwave at 50% power for 1 min., stir, then microwave for another min. at 50%.
Spread chocolate onto the toffee with a rubber spatula.
Optional: Dust with finely chopped almonds.

Once everything is cool, crack and enjoy with your friends.

Don’t try to make two batches at once.

Duck, Duck, Dog

I’m happy to announce the release of Duck, Duck, Dog: a concentration and memory game. I’ve been working on this off and on for the past couple of weeks, and now I think it’s finally ready to be released. The game is based on the n-back task, which has been used to test and train a person’s working memory.

You earn points by correctly matching the symbols with what was previously shown, and you get more points the more matches in a row you get right. You can also earn a lot more points at higher levels, so don’t spend all of your time on Level 1.

I hope you enjoy the challenge. Good luck!

Style Tweak

I tweaked the layout and font of my blog a bit. It’s now 100px wider and the font and line-height are a little larger. Hopefully it’ll be a little easier to read on high resolution monitors while still fitting nicely at 1024×768. Check out the Before and After, and let me know what you think.

IE9, Firefox 4, and Chrome 10

The current generation of browsers is looking pretty good.  Not all of them support all of the features that I’d like to see (WebSockets, WebWorkers, HTML5 Forms, full CSS3, etc…), but they all have fast Javascript engines, canvas and audio tag support, which makes things like video games a lot easier to build on the web.  A few weeks ago, I built this little page to play around with drawing some animation on a canvas tag.


Canvas Sparks


One thing I like about it is that it gives me a sort of a benchmark to compare the different browsers.  For example, I had some real trouble getting the hidden footer bar to show up when you move the mouse over it in Internet Explorer 9 until someone from Microsoft suggested a workaround*.  I’m also able to measure the frame rate, which is a good indication of how fast the browsers are at rendering (at least for this scene).  Here are the numbers on my PC running maximized with the default settings:

  • Internet Explorer 9 – 115 FPS
  • Firefox 4 – 45 FPS
  • Opera 11 – 25 FPS
  • Chrome 10 – 21 FPS

While this makes Chrome look pretty bad, it’s actually quite fast and comparable on most tasks. I’ve come to prefer it for most of my browsing (although, that was before IE 9 and FF 4 were released, so who knows if I’ll switch again). I tried Chrome with GPU acceleration enabled, but it actually rendered slower and incorrectly (there’s a good reason Google hasn’t enabled this feature by default).

On my iMac, here are the numbers:

  • Firefox 4 – 41 FPS
  • Opera 11 – 40 FPS
  • Chrome 10 – 30 FPS
  • Safari 5 – 29 FPS

I’d be curious if anyone else is getting different rankings. Just mouse over the footer area to see your frame rate, or to play with the other options.

* Oh, and if you’re interested in the workaround: the problem was that the content of the footer DIV was hidden with display:none, so IE wouldn’t register the mouse being over it. Apparently, IE uses something called hard and soft hits to trigger the :hover selector. An “empty” DIV is treated as a soft hit. The workaround is to set a transparent background on the DIV, either with a transparent image, or with background:rgba(0,0,0,0) in the CSS style.

« Previous Entries