Zappar, the makers of ZapBox, have provided their Kickstarter backers with a beta project to play around with prior to the arrival of their ZapBox kits.
The aim of their project is to help developers create mixed reality content and to let users experience that content without the need for expensive hardware. Following Google’s successful foray into accessible virtual reality with Cardboard, ZapBox is a simple cardboard viewer which holds a smartphone running the ZapBox app on iOS or Android. Using printed black and white markers, the smartphone’s camera maps the surrounding environment and positions mixed reality content.
Cheap and fast prototyping
As per my previous post, I’m excited about being Project Backer Number 403 and adding ZapBox to my prototyping toolkit.
Rapid prototyping with tools such as paper, cardboard, role-playing and game pieces is important, but there comes a point working in the mixed reality space when you really need to get a feel for where objects are located in 3D space, how they appear against the real world and how users will interact with those elements. The ability to test a hypothesis quickly and cheaply can save a lot of time and guide content creation down more fruitful avenues.
There is certainly a gap in the market for a cheaper price point, even in these early days of mixed reality. For example, the development edition of the Microsoft HoloLens will set you back $4,369 while the Meta 2 development kit costs $949. But how does ZapBox compare in terms of features and functionality? The ZapBox team undertook a review of the mixed reality hardware landscape to provide a comparison.
Although ZapBox has strong technical features, it lacks an important capability found in other hardware: markerless tracking. My understanding is that printed markers are required to:
- identify a real world object that has mixed reality properties,
- position mixed reality objects, and
- map the physical environment in which mixed reality elements will move.
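Zappar hasn’t published the details of its tracking pipeline, but the general idea behind marker-based positioning can be sketched in a few lines: once a marker’s pose (position and orientation) has been estimated from the camera image, any content anchored to that marker is transformed from marker-local coordinates into the world. The function names and the yaw-only rotation below are simplifying assumptions for illustration, not Zappar’s actual code.

```python
import math

def rotate_y(point, angle_rad):
    """Rotate a 3D point about the Y (vertical) axis."""
    x, y, z = point
    c, s = math.cos(angle_rad), math.sin(angle_rad)
    return (c * x + s * z, y, -s * x + c * z)

def marker_to_world(point, marker_yaw_rad, marker_position):
    """Map a point defined relative to a marker into world coordinates,
    given the marker's estimated orientation (yaw only, for simplicity)
    and position. Real trackers estimate a full 6-DoF pose."""
    rx, ry, rz = rotate_y(point, marker_yaw_rad)
    tx, ty, tz = marker_position
    return (rx + tx, ry + ty, rz + tz)

# A virtual button sits 0.1 m above the centre of a marker detected
# 0.5 m in front of the camera, rotated 90 degrees about vertical.
button_local = (0.0, 0.1, 0.0)
world = marker_to_world(button_local, math.radians(90), (0.0, 0.0, 0.5))
print(world)  # height above the marker is preserved; x/z follow its pose
```

Every frame, the app re-estimates each marker’s pose and re-runs this transform, which is why content appears “stuck” to the printed sheet as the camera moves.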
Despite this weakness, I still believe it will be a great tool for early stage testing of concepts and basic interactions with users.
Testing out the demo for ZapBox beta
The demo included:
- 3 x PDF printouts for the controller (shaft, diamond/hexagonal dome and cap)
- 1 x PDF printout for the tracking markers
- ZapWorks Studio file
I printed and cut out the controller PDFs on standard A4 paper. Since the shaft was going to have the most handling I decided to reinforce it with lightweight cardboard. I measured and noted the same fold lines as the printout, scored the cardboard along the fold lines with a blade, then glued the printed sheet on top. This felt much sturdier than paper alone.
When I tried to attach the controller diamond to the shaft it was pretty fiddly. Then I realised there were little lines along one side of each hexagonal piece. I’ve marked this in red in the image below. This line actually represents a cut. I recommend using a blade to create these rather than scissors. The blue flaps then tuck neatly into each adjoining cut. (Hey Zappar! If you’re going to use these PDF templates in the future, please indicate “cut lines” using colour, a dotted line or something. Thanks!)
After some fiddly handling I inserted all the flaps in the diamond then glued it to the shaft, adding tape for extra strength. Voila!
Now it was time to test it out. I opened the ZapBox beta app on my Android smartphone and tapped Get Started.
Next I scanned the demo ZapCode. When the code had finished loading (just a second or two) I focused my camera on the controller and markers.
Success! The controller diamond transformed into a blue flower while the tracking sheet displayed a large 3D button. Using the controller/flower I could press the button. This resulted in the button depressing, my smartphone screen flashing red and an alarm sounding.
A few notes on my own user experience
I felt it was important to document my initial user experience. Too often, we rush into using new technology, eager to figure out how everything works before optimising for maximum efficiency. But we soon forget the novelty of our first experience. The new features that surprised and delighted us. Or the frustration and confusion of something that wasn’t intuitive. (It doesn’t work. Is it just me? Who the heck designed this?) Standing back and reflecting on our own experience is also a good reminder down the track, when we design new interactions or our designs are placed in the hands of new users.
- Not hands free, yet. It was fiddly holding my smartphone in one hand and the controller with the other while trying to aim the camera in the right direction. But this shouldn’t be an issue with ZapBox as it will be strapped to one’s head, leaving hands free to move and manipulate the controller(s).
- It’s pretty! The controller diamond’s flower was aesthetically pleasing. It was a nice middle ground, not too cartoon-like but also not trying too hard to be realistic. Blue was also a good colour selection creating a strong contrast against the real world environment.
- Tracking. The app tracked and rendered the controller diamond’s flower very well. Moving closer to the flower allowed me to see more detail. An interesting side note is that when the controller’s diamond cap was removed, I could see “inside” the flower. Not sure if this was part of the design, but it was cool.
- X, Y, Z movement. Moving the controller to press the button was a little tricky. I had to get my head around how to move my hand through Z space, not just up and down. It didn’t take long to get used to but it wasn’t as intuitive as I anticipated. It reminded me of my experience learning to use Leap Motion for the first time.
- Absence of tactile/haptic feedback. Pressing the button was cool but I missed the satisfying sensation of resistance as the button was “clicked”. The Zappar team’s use of visual (red flashing screen) and audio cues (alarm) is a great way to compensate for this feedback absence. It provides the user with positive feedback as a direct result of their physical action. This will be an important feature in the design of future mixed reality user interfaces.
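The demo’s substitution of visual and audio cues for missing haptics boils down to a threshold check: the moment the controller tip travels far enough along the button’s press axis, the cues fire together so the user gets immediate confirmation of their physical action. The names and threshold below are illustrative assumptions, not Zappar’s implementation.

```python
def button_feedback(press_depth_m, press_threshold_m=0.02):
    """Return the cues to trigger given how far (in metres) the controller
    tip has pushed into the virtual button along its press axis.
    Visual + audio cues stand in for the missing tactile 'click'."""
    if press_depth_m >= press_threshold_m:
        return {"visual": "flash_red", "audio": "alarm", "pressed": True}
    return {"visual": None, "audio": None, "pressed": False}

print(button_feedback(0.01))  # hovering above the threshold: no cues
print(button_feedback(0.03))  # past the threshold: flash + alarm fire
```

Firing both cues in the same frame as the state change is what makes the interaction feel causal despite the absence of physical resistance.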
Developing content with ZapBox
There are several video tutorials available on the Zappar YouTube channel which cover how to use ZapWorks Studio to create content and program interactions. I’ve been through all the tutorials and it looks fairly straightforward. So the next step will be to use the controller to create an experience of my own. I’m keenly awaiting the arrival of my ZapBox through the Kickstarter project. In the meantime I already have a few ideas in the design pipeline.
So is ZapBox the ultimate mixed reality rapid prototyping tool?
I’ll only be able to answer this question once I’ve had the chance to create a prototype but so far it looks good.
- Inexpensive development. Apart from purchasing some ZapCodes ($1.50 each through a Personal Developer account) there is no cost to create prototypes. The only other materials you need are paper and cardboard. All content is hosted on Zappar’s ZapWorks Studio platform.
- Fast development and iterations. Changing content is as easy as updating and committing your code then rescanning the ZapCode.
- Quick and easy user testing. There is minimal setup for the user before they can start interacting with content. The headset does not require any wires so the user experience is completely untethered.
The quality of mixed reality experiences will depend heavily on the way in which users interact, move around, and perceive mixed reality objects against a real world backdrop. The design process will benefit from early insight into the flaws or weaknesses of initial designs via prototyping, testing and iterations. Tools like ZapBox can help provide this insight before developers create detailed designs for more complex and expensive mixed reality hardware.