Cormerica: So many fascinating things!

Laser Turret

Seneca Ridge Middle School students at the workshop

The Laser Turret project was built for a STEM technology workshop that I organized at the local middle school. It was the project chosen from several options I offered during a talk on computer engineering and technology careers.

Weapons System Design

To be cool enough to impress kids, I wanted the turret to have motorized two-axis aiming to pan and tilt the laser, plus fully automated target acquisition and firing. To designate its own targets, the turret would need some kind of sonar/radar scanning and, of course, a working laser, lights and sound.

I wanted to show that we could build this new idea from scratch, so the whole thing would start with an original design that would be 3D printed to make the turret parts.

Components

Nine-gram micro servo

Before we could design the turret itself, we needed to choose the electronic and mechanical components that would define its operation. We wanted to try to stick to as simple a design as possible, so that meant thinking small.

The tiny 9g micro servo is about as small and simple as mechanical output gets! 180 degrees of roboty sound and motion driven directly by one 5V pin.

HC-SR04 Ultrasonic Sensor

To let the turret scan its environment for enemies, we imagined a scanning sonar solution based on the HC-SR04 ultrasonic sensor. This common starter-kit component is made to sense distance using high-frequency sound echoes, but I saw no reason why we couldn’t spin it around and “look” in all directions.
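
Just to make the scanning idea concrete, here is a minimal Arduino-style sketch of a sonar sweep. The pin numbers are made up for illustration (the real wiring shows up in the Pinout section below): a servo slowly pans the sensor while the HC-SR04 reports a range at each bearing.

```cpp
#include <Servo.h>

// Hypothetical pin assignments for illustration only.
const int SCAN_SERVO_PIN = 9;
const int TRIG_PIN = 7;
const int ECHO_PIN = 8;

Servo scanServo;

// Measure distance (cm) with the HC-SR04: send a 10 us trigger pulse,
// then time the echo. Sound travels ~0.034 cm per microsecond, round trip.
long readDistanceCm() {
  digitalWrite(TRIG_PIN, LOW);
  delayMicroseconds(2);
  digitalWrite(TRIG_PIN, HIGH);
  delayMicroseconds(10);
  digitalWrite(TRIG_PIN, LOW);
  long duration = pulseIn(ECHO_PIN, HIGH, 30000);  // time out at roughly 5 m
  return duration * 0.034 / 2;
}

void setup() {
  pinMode(TRIG_PIN, OUTPUT);
  pinMode(ECHO_PIN, INPUT);
  scanServo.attach(SCAN_SERVO_PIN);
  Serial.begin(9600);
}

void loop() {
  // Sweep the sonar back and forth, reporting range at each bearing.
  for (int angle = 0; angle <= 180; angle += 5) {
    scanServo.write(angle);
    delay(60);                       // let the servo settle
    Serial.print(angle);
    Serial.print(" deg: ");
    Serial.print(readDistanceCm());
    Serial.println(" cm");
  }
}
```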

Five milliwatt laser diode

The laser itself is a genuine 5-milliwatt 650-nanometer laser, which is a fun way of not having to say that it’s a 35-cent laser-pointer diode.

So that’s one servo for pan, one servo for tilt and a third to rotate our scanning sonar back and forth. Add in one ultrasonic sensor, one serious-looking laser and a handful of variously colored LEDs and wires, and we’re still under $10.

The turret still needs brains, a control system to process the input signals, select targets, align the laser and send those photons down range. A compact weapons package like ours deserves a sleek piece of miniaturized computing power.

Adafruit Metro Mini

The Adafruit Metro Mini answers the call in black stealth pajamas. It packs a 16 MHz ATmega328 processor, serial communication and 20 GPIO pins into its thumb-sized package and looks super-cool doing it.

Design & Modeling

Rather than using an existing design, we had decided to create our turret from scratch. The first step was to decide how the turret would work mechanically, where the servos would go and how they would move the parts of the turret.

Here’s what we came up with.

Finished turret model

We threw away a number of more complex ideas and settled on a simple design where the laser can be stowed out of view and then pop up into pan & tilt action when an enemy is detected.

Turret Front View

From the side, you can see the rotating sonar platform as well as the pan and tilt laser weapons platform.

Turret Side View

The model was built in Blender and refined with several tests using Ken’s 3D printer to ensure the eventual fit of the final parts.

Test parts to refine fit

Here’s one of the test prints that shows a servo inside an enclosure that served as the model for all the servo housings. The loose square piece was used to test the fit of the servo arm, mounted on the servo. Below the test piece, you can see the final printed arm that was made from the test part’s geometry.

Exploded parts view

The design allowed for certain pieces to lock together and for others to rotate against each other.  You can see some of the types of connections in this exploded view.

The weapons array consisted of a large “flash” LED to enhance the firing effect as well as a red LED that would mimic the laser without the eye-damaging laser light. The laser itself would be active for less than a tenth of a second, but that was enough to mark you with a red laser dot if you were “hit”.
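
Here is roughly what that firing effect looks like in code. This is a minimal sketch with hypothetical pin numbers, not the turret's actual program: the white flash and red "beam" LEDs come on together, and the real laser is pulsed for well under a tenth of a second.

```cpp
// Hypothetical pin numbers; the real wiring is in the pinout diagram below.
const int FLASH_LED_PIN = 4;
const int RED_LED_PIN = 5;
const int LASER_PIN = 6;

void setup() {
  pinMode(FLASH_LED_PIN, OUTPUT);
  pinMode(RED_LED_PIN, OUTPUT);
  pinMode(LASER_PIN, OUTPUT);
}

// One "shot": white flash, red LED standing in for the beam,
// and a laser pulse well under a tenth of a second.
void fire() {
  digitalWrite(FLASH_LED_PIN, HIGH);
  digitalWrite(RED_LED_PIN, HIGH);
  digitalWrite(LASER_PIN, HIGH);
  delay(60);                        // laser on for only ~60 ms
  digitalWrite(LASER_PIN, LOW);
  digitalWrite(FLASH_LED_PIN, LOW);
  delay(200);                       // let the red "beam" linger briefly
  digitalWrite(RED_LED_PIN, LOW);
}

void loop() {
  fire();
  delay(2000);   // demo only: one shot every two seconds
}
```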

Once complete, the laser turret model was virtually disassembled and the pieces were aligned for the 3D printing of the parts.

3D printing layout

Programming

The turret’s programming consisted of three simple, specialized components: a master command program, plus additional code to manage the sonar system and the laser-platform sub-components.

Turret system component model

By delegating target acquisition and firing to the sub-components, the command program became very simple, only needing to ask the sonar for a target and then hand it off to the firing platform.
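
The sketch below shows that delegation idea in miniature. The structure is the point here; the function names and the Target struct are just illustrative stand-ins for the real sonar and platform code that ran on the Metro Mini.

```cpp
// Sketch of the command loop only; sonar and platform details are stubbed out.
// All names here are hypothetical, not the turret's actual code.

struct Target {
  bool found;
  int panAngle;    // bearing reported by the scanning sonar
  int distanceCm;
};

// Stub: in the real turret this sweeps the sonar servo and reads the HC-SR04.
Target askSonarForTarget() {
  Target t = {false, 0, 0};
  // ... sweep, find the nearest echo, fill in t ...
  return t;
}

// Stub: in the real turret this raises the laser platform, pans/tilts and fires.
void handOffToFiringPlatform(const Target &t) {
  // ... aim at t.panAngle, run the flash/laser firing sequence ...
}

void setup() {}

void loop() {
  Target target = askSonarForTarget();
  if (target.found) {
    handOffToFiringPlatform(target);
  }
}
```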

Pinout

Connections between the turret and the microprocessor

Once the physical components and the programming of the turret were defined, it was time to look at the wiring for the Adafruit Metro Mini electronic control system. For the programming to work, all the servos, LEDs and other components needed a connection to the microprocessor.

I also created a small operator panel with an activation button and two small status LEDs. This diagram shows how it all worked out.
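
As a rough illustration of the operator panel's logic (with made-up pin numbers, not the ones in the diagram), the activation button can simply toggle an "armed" state that the status LEDs report:

```cpp
// Hypothetical operator-panel wiring: one activation button, two status LEDs.
const int BUTTON_PIN = 2;       // button to ground, using the internal pull-up
const int ARMED_LED_PIN = 3;
const int TARGET_LED_PIN = 4;

bool armed = false;

void setup() {
  pinMode(BUTTON_PIN, INPUT_PULLUP);
  pinMode(ARMED_LED_PIN, OUTPUT);
  pinMode(TARGET_LED_PIN, OUTPUT);
}

void loop() {
  // Button reads LOW when pressed; toggle the armed state.
  if (digitalRead(BUTTON_PIN) == LOW) {
    armed = !armed;
    delay(250);                 // crude debounce
  }
  digitalWrite(ARMED_LED_PIN, armed ? HIGH : LOW);
  // TARGET_LED_PIN would be driven by the sonar code when an enemy is in range.
}
```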

Assembly

3D-printed turret parts

The turret came back from the printer as a baggie full of loose parts. (Shout out again to Ken and his printer.) Our next step was sorting through all the parts and beginning the assembly.

Printed parts and components

Here are all of our turret parts arranged for assembly.

All we needed was an assembly crew.

Turret assembly crew

And here they are, my daughters Maddie and Cas, who were kind enough (at least for a little while) to put up with a lot of fidgeting with tiny wires.

Turret wiring harness (28 AWG)

These two images show the connections to the Metro Mini and the completed turret. Even with 28-gauge wire, the wiring’s weight and tension interfered significantly with smooth turret operation. In other words, maybe I should have used even smaller wires… or better yet, a BIGGER turret!

Next time, perhaps.

Testing & Demonstration

Live-fire testing in a classroom environment

Cas and Maddie helped me test the turret in a large basement room. A few tweaks to the programming were all it took to start tracking and firing at the girls as they moved around the room.

At the middle school, the kids enjoyed trying to evade the automated firing system, but they were quick to exploit the limitations of the platform, such as having everyone attack at once.

STEM Workshops

Thanks for attending my STEM presentation!

We talked about the many exciting and fun opportunities there are in science and engineering careers, and we saw how much can be done with a little understanding of how to connect electronic parts together and with programming no more complex than you’ll see in Hour of Code.

While technology devices are shrinking, the jobs in computers, engineering and science are growing at an astounding rate.
It’s truly an exciting time and careers like those we’ve talked about will let you participate in and shape the next generation of the technology people use every day.

Below are some additional resources that you may be interested in.

After-School Workshop Interests Form

If you’d like to attend the after-school workshop, I’d like to know what you’d find most interesting. Please click on the form link below to give me your thoughts.

More About the uArm Robot

The robotic tic-tac-toe player is an ongoing project and you can follow all the progress here.

This short video gives a good overview of the vision system.

You can play tic-tac-toe against the actual game algorithm programmed into the robot by clicking on the image below.

Challenge the Robot now on the Web

If you are a school administrator or a parent and are interested in more information, please email me.

Robot Tic-Tac-Toe

Robot-in-Training

The uArm Metal Autonomous Tic-Tac-Toe Robot

Welcome to the project page for the Autonomous Tic-Tac-Toe Robot. I’ll update this page periodically with any new progress if you want to follow along.

Goals

The overarching goal for the robot is to play fully autonomously, like a human. That goal breaks down into five major capabilities I decided the robot would need.

  • Understand the Game
    The robot must understand how the game is played and have some sense of a strategy for winning.
  • Sense the Board
    The robot must be able to interpret the moves that the player makes and understand the positions of the pieces on the board.
  • Move the Pieces
    The robot must make its own moves at the appropriate times, using whichever marker has been assigned.
  • Play Autonomously
    All the programming must reside within the robot itself and it should play without any connected computer or external signal.
  • Convey Emotion
    This is a “stretch goal” to see if the robot can convey emotion to the player, based on the status of the game.

About the Robot Arm

The robot arm is a uArm Metal, made by uFactory. It was created initially as part of a Kickstarter campaign and is now in production.

The uArm Metal

The uArm is a 4-axis robotic arm with three degrees of freedom as well as a hand-rotation axis. It’s based on a design for an industrial pallet-loader and is thus best suited for positioning and stacking items in a 180-degree operating area.

The uArm is controlled by an on-board Arduino-compatible micro-controller and is fully open-source.

Understanding the Game of Tic-Tac-Toe

There are lots of resources on the web that will tell you how to play tic-tac-toe, and there are many ways to teach a computer to implement game strategy. For reasons related to the memory available on the microprocessor, I wrote an algorithm based on logical human strategies such as “move to an open corner such that a block does not result in two in a row for the opponent.” The computer must understand both how to translate that strategy into an appropriate move and when to apply that particular strategy.
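
To give a flavor of that rule-based approach, here is a simplified C++ sketch, not the robot's actual code: take a winning move if one exists, block the opponent's winning move otherwise, and then fall back to the center, corners and edges.

```cpp
// A simplified rule-based move chooser (illustration only).
#include <iostream>

char board[9];                       // 'X', 'O' or ' ', row-major
const int LINES[8][3] = {{0,1,2},{3,4,5},{6,7,8},
                         {0,3,6},{1,4,7},{2,5,8},
                         {0,4,8},{2,4,6}};

// Return a cell that completes two-in-a-row for 'who', or -1 if none exists.
int findCompletingMove(char who) {
  for (const auto &line : LINES) {
    int count = 0, empty = -1;
    for (int cell : line) {
      if (board[cell] == who) count++;
      else if (board[cell] == ' ') empty = cell;
    }
    if (count == 2 && empty != -1) return empty;
  }
  return -1;
}

int chooseMove(char robot, char human) {
  int move = findCompletingMove(robot);          // 1. win if we can
  if (move != -1) return move;
  move = findCompletingMove(human);              // 2. otherwise block
  if (move != -1) return move;
  const int PREFERRED[9] = {4, 0, 2, 6, 8, 1, 3, 5, 7};  // 3. center, corners, edges
  for (int cell : PREFERRED)
    if (board[cell] == ' ') return cell;
  return -1;                                     // board is full
}

int main() {
  // Example position: human 'X' threatens the top row; robot 'O' must block.
  const char start[10] = "XX  O    ";
  for (int i = 0; i < 9; i++) board[i] = start[i];
  std::cout << "Robot plays cell " << chooseMove('O', 'X') << "\n";  // prints 2
  return 0;
}
```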

The initial version of the strategy algorithm worked so well that the robot was unbeatable and therefore, noted my daughter Cassie, no fun at all.

Challenge the Robot now on the Web

A final version of the robot game-logic incorporates three difficulty levels based on characters I hope to explore further in the emotion section.

You can play tic-tac-toe against the actual game algorithm programmed into the robot by clicking on the image of the game board.

Sensing the Board

Computer vision is a computationally expensive thing and beyond the reach of most small micro-controllers. The robot solves this problem with the Pixy camera from Charmed Labs, which employs a dedicated on-board processor for simple object recognition.

This short video gives a good overview of the vision system.

The Pixy gives me, the programmer, access to a much simpler stream of information in which blobs of different colors are represented as screen rectangles. This helps a lot.
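
For example, turning a blob into a board position can be as simple as dividing its pixel coordinates by a cell size. The numbers below are invented for illustration; the real values depend entirely on where the camera sits relative to the board.

```cpp
// Illustrative only: map a blob's center (camera pixels) to a 3x3 board cell.
#include <iostream>

struct Blob { int x, y, width, height; };   // simplified stand-in for a Pixy block

const int BOARD_LEFT = 60;     // pixel x of the board's left edge (made up)
const int BOARD_TOP  = 40;     // pixel y of the board's top edge (made up)
const int CELL_SIZE  = 70;     // pixels per cell (made up)

// Returns 0..8 (row-major) or -1 if the blob falls outside the grid.
int blobToCell(const Blob &b) {
  if (b.x < BOARD_LEFT || b.y < BOARD_TOP) return -1;
  int col = (b.x - BOARD_LEFT) / CELL_SIZE;
  int row = (b.y - BOARD_TOP) / CELL_SIZE;
  if (col > 2 || row > 2) return -1;
  return row * 3 + col;
}

int main() {
  Blob piece = {210, 120, 30, 30};           // a detected game piece
  std::cout << "Piece is in cell " << blobToCell(piece) << "\n";  // cell 5
  return 0;
}
```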

Moving the Pieces

Moving the pieces was pretty straightforward using the built-in programming of the uArm, but I thought it could be better. I made a number of improvements to the way the arm worked and contributed those changes back to the open-source project where they’re now part of the arm’s programming. Pretty cool!

The video below also shows an example of common engineering math. You’ll find that it’s not really so hard!

Playing Autonomously

Combining the vision system with the robot’s movement system is the next challenge!

The robot’s micro-controller is part of a custom circuit-board developed specifically for the uArm and is therefore optimized for control of the robot. Without access to the input/output pins normally available on an Arduino microprocessor, the options for interfacing the vision system with the robot’s hardware are quite limited.

Thus, I haven’t yet been able to run both vision and movement from the single robot micro-controller. I have some ideas, though!

Conveying Emotion

If you’ve watched my STEM talk on Computer Engineering or the movement video above, you’ve seen that the robot is capable of expressing at least a few different emotions. I hope to build out this capability once the other issues are solved.

Final Update

Thanks to a comment on one of the YouTube videos, I realized today that it had been more than a year since I promised to update this page with progress. So here’s what happened.

Even if the two halves work flawlessly, there will be unforeseen issues when they try to work together. And when the only communication channel your on-board computer has is being used for a camera, it is impossible to debug a robot.

My approach was to rig a software serial port off of two unused uArm board pins strung to another ‘debug’ Arduino that would simply proxy data from the uArm to the computer via USB. Once robot debugging was complete, the external Arduino could be disconnected and the robot would run autonomously.
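
Conceptually, the "debug" Arduino's sketch is tiny; something along these lines, with example pins and baud rates rather than the actual wiring:

```cpp
// Sketch for the external "debug" Arduino: forward whatever arrives on a
// software serial port (wired to two spare uArm pins) out over USB.
// Pin numbers and baud rates here are examples only.
#include <SoftwareSerial.h>

SoftwareSerial uarmLink(10, 11);   // RX, TX

void setup() {
  Serial.begin(115200);            // USB link to the computer
  uarmLink.begin(9600);            // software serial link from the uArm
}

void loop() {
  // Proxy in both directions so the computer could also send commands back.
  if (uarmLink.available()) Serial.write(uarmLink.read());
  if (Serial.available())   uarmLink.write(Serial.read());
}
```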

In the end, grounding problems between the Arduinos and glitchy Pixy performance due to its not getting enough juice off the uArm board were enough to ground the project.

I’m more of a software guy. When it comes to low-level things like wire protocols and power systems, I want someone else to handle them. I had made each half work, and that was all I needed! 🙂

BattleDrones Video Game

The BattleDrones title screen

The ever-escalating war of billion-dollar video games has been a boon for independent developers. The two leading video game platforms, which until recently cost thousands of dollars apiece, are now free for anyone to use. Thanks again, Internet, we may yet forgive you for closing the bookstores!

Unity and Unreal, the two competing giants responsible for dozens of the most popular games, are fighting for the minds of developers. They figure (correctly) that the more developers who are familiar with their platform, the more likely it is that the next winning game will come from their technology. The “catch” is that you commit to pay them a cut of future revenue, but only if your game is commercially successful.

That meant that I could make a game and pay nothing.

Game Concept

Not too long ago, Markus “Notch” Persson, the creator of Minecraft, had a fantastic idea for a video game. In the now-defunct concept, players would write computer code to control a spacecraft. Each player had the same number of computing cycles, so efficient coding itself was the core game mechanic.

It may seem odd for a game to be about programming, but these days many video games have giant communities of player-developers who write code to modify those games. Often these “mods” are so good that they become de facto extensions to the game.

Notch’s concept was very grand, but I thought it would be pretty easy to do something like it on a small scale, as a proof-of-concept game demo.

In BattleDrones, the player controls a computer-driven starship, a BattleDrone, that will carry humanity’s fight against the evil alien invasion. A Drone operates only via the computer code that the player writes and must be programmed to sense danger and fight autonomously.

Getting Started with Unity

I really had no experience writing video games nor any familiarity with either Unity or Unreal. Luckily, for those who know nothing about something, there is YouTube.

I chose the Unity platform and dove into making some tutorial games to get an idea of how the engine worked. As you may have gathered from this blog, I learn a lot by making mistakes, so jumping right in and making those mistakes as quickly as possible is always a good approach for me.

The first game concept I worked on with Unity, however, was a game called Fork that I started with a developer friend of mine. Fork’s mechanic was manipulating magnetized crates (in a space hangar) in order to progress in the game. While the concept was interesting and we learned a lot about Unity, we kind of hit a dead end and moved on to other projects.

When I began development of Fork, I expected that there would be tools to, say, put a title here or put a magnet there. I figured one could make objects and add textures to them. You know, make a video game! It turns out that I was quite wrong about how that worked.

Developing a Video Game

While game development tools are very powerful, they are also very basic. If you want to make a cube, well you can do that right there in Unity! If you want a spaceship, however, you are building that in some other software package.

The simplicity of the game-development tools turned out to be more good than bad. One of the most powerful capabilities of a gaming platform lies in its ability to simulate physics. This allows a developer to basically set physics in motion and let the game engine handle the rest.

For example, a developer can assign mass to a truck model, put it on a road and spin the tires. The game engine then calculates everything from the friction of the wheels to the steepness of the terrain in order to determine how the truck will move over the road. Attach a joystick input to the front wheels and boom, you have a driving game.

Visualizing the algorithm for snapping magnetic boxes together

For the magnets in Fork, our work involved telling the game engine how much force a magnet would exert on a body at a given distance and Unity took care of dealing with mass, friction and the movement of the crates resulting from the combined magnetic forces.
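
The actual Fork code lived in Unity scripts, but the core of it was a small calculation along these lines, shown here as a stand-alone C++ sketch with an assumed inverse-square falloff and made-up constants: compute a force vector toward the magnet and hand it to the physics engine.

```cpp
// Not the actual Fork code; just a sketch of the idea: given two crates,
// compute an attractive force that falls off with distance, and let the
// physics engine do the rest.
#include <cmath>
#include <cstdio>

struct Vec3 { double x, y, z; };

// Assumed model: force along the line between magnets, strength ~ k / d^2,
// clamped so touching crates don't produce an infinite force.
Vec3 magneticForce(const Vec3 &from, const Vec3 &to, double strength) {
  Vec3 d = {to.x - from.x, to.y - from.y, to.z - from.z};
  double dist = std::sqrt(d.x * d.x + d.y * d.y + d.z * d.z);
  if (dist < 0.1) dist = 0.1;                       // clamp at near-contact
  double magnitude = strength / (dist * dist);      // inverse-square falloff
  return {d.x / dist * magnitude, d.y / dist * magnitude, d.z / dist * magnitude};
}

int main() {
  Vec3 crate = {0, 0, 0}, magnet = {2, 0, 0};
  Vec3 f = magneticForce(crate, magnet, 10.0);      // pull the crate toward the magnet
  std::printf("Force on crate: (%.2f, %.2f, %.2f)\n", f.x, f.y, f.z);  // (2.50, 0.00, 0.00)
  return 0;
}
```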

Like Making a Movie

Developing a video game is surprisingly similar to making a movie.  Here’s a quick breakdown of some similarities.

Sets

A video game takes place in an environment that serves as the background to the scene. In BattleDrones, the set is the planet and the space that surrounds the player in three dimensions. I bought it for a few bucks from an online asset store where developers sell all kinds of game assets cheap.

Props

Drone model with final paint scheme (but no lights)

In a game, you’ll have objects in the scene that the player interacts with as part of the story. In BattleDrones, these are the spaceships that are controlled by the player or by the computer “enemy”. I created the ship model in Blender, the modeling package we used to create the sets and objects for Sector 42.

An early paint test

The ship then needed to be textured to give it a “painted” appearance. You can see that the game has two ship types that differ only by their paint job. Painting was done initially in Adobe Illustrator and then finalized in Photoshop.

The trainer/drone model with its two paint schemes

 

Lights

Lighting is almost as big a deal as it is in film production. Video games use the very same lighting concepts that are used in photography. BattleDrones contains both set lighting of various types and decorative lights (“practicals”) that are visible in the scene.

Camera

Again, just like in film production, cameras have focal lengths and depth-of-field, in addition to a number of technical characteristics. In BattleDrones, the camera orients itself around the playfield but in Fork, the player object carries the camera so that you see what the player is pointing towards.

Action!

In video games, action is initiated by the player and is controlled by the game engine. In BattleDrones, the player’s computer code is interpreted by the game engine and “acted out” by the player’s ship. Because of the Unity physics engine, I only had to apply the appropriate force (when the engine was firing, for example) to move the ship.

A running game screen with the user-interface overlaid

Sound

Many of the same audio concepts and tools used in film and video are used to create and mix video game sounds. Most of the work is in finding or making sounds that fit the game. The BattleDrone sounds were all sourced from a website where people share captured sounds for free. Need a tea kettle whistle to make a steam-like thruster sound? A jet engine for your spaceship? You will find them all there.

Titles

The interface for choosing the training scenario

I’ll cheat a bit here and lump the graphical user interface (GUI) in with titles. The GUI is separate from the physics simulation taking place in the scene and provides the buttons, menus and information outputs that the player sees on the screen.

Play The BattleDrones Demo

Because this is a demo, I’m showing you what I think will be most interesting. In the demo, the code has already been written by the player and you get to just see what happens when the spaceship runs the player’s code.

I see that this has turned into another long post, but I hope you enjoyed a peek into making a video game. You can try out BattleDrones on Windows, Mac or Linux by visiting the link below and clicking the link for your operating system. Once you’ve downloaded the zip file, simply unzip it. (There is no installer.)

Download BattleDrones!
Molokai Honey

It was summer in Hawaii and we decided that the family would move to the island of Molokai to raise bees and harvest honey as a family business.

Or, at least that’s what we told the kids right before we put them in front of the camera.

We hope you like our bee movie! The mockumentary style covers up a lot of our errors, and we threw in a few intentional goof-ups along the way. The result, we hope, is a comedy.

Molokai Vacation Location

I never make ideas, I only find ideas. It’s true that we did have a few bees in the pool on our summer vacation, and it was my wife Cindy who came up with the idea of a family honey-business mockumentary.

Being on vacation and considering the slower “Molokai time” of the small island, I enthusiastically jumped on board, interviewing the kids before they could figure out what was going on.

I shot the main interview footage one afternoon on our Molokai vacation using only a handheld iPhone 5 with available light. There was really no script or plan for the story at that point, so I chose some provocative questions and hoped for the best.

The Kaneshiro family and Molokai Meli were real, of course, and we did find them only after we began filming — on the Internet. They were very gracious, not only letting me interview them but also sending me a few dozen still images of bees and their hives. While in Hawaii, I also tracked down a producer who was kind enough to let me use his secondary bee footage that you see in the cutaways.

Story Development

Back in Virginia, I picked through all the vacation footage and started piecing it together into a kind of narrative.

I had begun to build a story around the best of the kids’ reactions, but it was a bit flat and disjointed. The kids’ answers to the various interview questions were pretty good, but the story needed some kind of structure to tie it together.

Dad’s set, Basement Studios Inc.

Rather than using a narrative voice-over, I wanted the film to be of the style where the interviewees tell their entire story, prompted only occasionally by the interviewer. While Mason, Cassie and Maddie were completely unscripted in Hawaii, I had some direction in mind when I filmed Cindy that fall back in Virginia.

Hailey’s Room set, courtesy: Guest Room

The Virginia shoots were much later in the season, and we were lucky to capture Cindy’s video outside before the weather turned gray and forced us into interiors that would need good lighting. That winter, Hailey and I wrote scripted scenes for the two of us around indoor settings and created some sets that would round out the story.

Post Production

The most fun on this project was getting to learn a number of new production techniques. Here’s a brief overview of some of the things we did.

Color Correction

Primary color correction is the work to adjust all of the visual assets (footage and stills) so that they have the proper white & black levels as well as good, balanced color. Despite my efforts to set exposure and white-balance in Hawaii, I found that not only did light levels and color shift between interview locations — but sometimes even between questions with the same subject in the same location! We learned how to correct for this as best we could to make the shots look similar.

I also found that some subjects would blend into the background too much, tempting the viewer’s eyes to wander around and lose their attention to the scene. For these, I created vignettes by subtly dimming the edges of the shot in order to highlight the subject.

Color correction progression

We also did some secondary color correction to tone-down bright or distracting things in the background and, at the end of it all, added the overall color grading that gives the film the brownish Molokai cast.

Cassie with secondary correction for the bright white bench

Image Stabilization

Because I had shot the Molokai interviews using a handheld camera, the shots looked pretty awful, bouncing around as I posed my questions. I used some video tools to stabilize these shots, giving the look of a camera mounted on a tripod. The ravine pan shot is a great example of what was originally a very shaky handheld video. If you want to see how the raw interview frames looked before stabilization, take a look at the post-credits clip of Mason.

Sound

The hall outside Hailey’s room, dressed with Hawaiiana

There was a lot of sound work. While the Molokai location sound was captured in-camera, all the “studio” sound was recorded separately from the video using lav and shotgun mics that were synced or mixed during the editing process.  All of the off-camera voices, including Hailey’s behind the door, were recorded separately and mixed in later.

While about half of the bird sounds are real, the other half were added from unused Molokai footage to expand the sound field. I even did a small amount of Foley work for the sound of my slippers as I follow Hailey into “her room,” which was actually our basement gym.

All sound clips at a minimum had to be converted to mono, cleaned of noise and compressed in dynamic range. Even before music was added, the audio project consisted of over 100 clips across 12 tracks.

Finalizing the audio track with Joe Ross

Joe Ross did an incredible job tweaking every audio clip and blending the tracks together into the final mix. I thought I had done a pretty good job with the rough tracks and was amazed at how much he was able to improve the overall composition.

Music

Music rights are a funny thing. There are many, many things you can get away with on YouTube, but using other people’s music is not one of them. I found out how good YouTube is at handling music rights when I first uploaded Sector 42 — The Space Movie. Within 30 seconds of uploading, YouTube had computationally identified some of the music as having a rights claim against it. I did have a license from the music publisher, so I had to work with them directly to confirm this to YouTube before the video was released.

Finding and licensing Hawaiian music for this project proved much more difficult. After many hours on the Internet dealing with inflexible licensing companies wanting hundreds of dollars per song, I finally found a smaller, independent source. They had some great original Hawaiian music and were kind enough to license me everything I needed for a fraction of the cost of just one of the more traditional tracks.

Goof-ups

We learned that a simple iPhone can capture some astoundingly good high-definition footage. We also learned that when the older iPhone 5 is coerced (via a third-party app) into capturing raw frames of video in the hot sun, it will overheat and completely lose its audio sync. Also, bring a tripod.

I completely screwed up Hailey’s shoot by recording the entire thing at 30fps, rather than the film rate of 24fps we were using. This turned out to be an almost unsolvable problem for reasons I didn’t initially understand and fixing it was an interesting but longer story that I’d love to tell another time.

I also learned that if parts of your image are so bright that they exceed the camera’s ability to record them properly, they cannot be fixed in post-production. This ruined some additional pool and bee shots I would have liked to include. Some of Mason’s shots couldn’t be properly light-leveled and appear much too bright as a result. A simple bed sheet to diffuse the sunlight would have fixed this.

We removed dozens of squeaks, hums, and car sounds to make the locations sound more pristine. We shut off furnaces and refrigerators back in Virginia to get cleaner sound. We used blankets and pillows to make a recording studio and learned that it’s virtually impossible to get rid of location wind noise that spans the vocal range.

Lower-third caption graphic

We removed fingers that obscured the camera lens, re-framed shots, restored “lost” audio and, in general, ran into all the newbie mistakes of making a movie. Along the way, we also made graphics and titles and managed to meet a lot of interesting people.

All in all, another great family project!

Maddie’s Travel Animation

Westward Ho!

Maddie and I made an animation for her History class! The presentation was about the westward expansion of the United States, and Maddie did all the writing, storyboarding and voice-over for the project. We only had a week to build it and had a great time doing it together.

If you’ve read any other posts on this site, you already suspect that we overdid this project. If so, you’d certainly be correct and, in fact, this was probably a bit worse than usual.

Raising cattle was easier than growing crops – the American burger was born

Maddie asked me one evening if we could build an animation for her History project on the westward expansion of the United States. She had a script written and a bunch of ideas. I was taking a break from the production of the Molokai Honey documentary and so it was perfect timing for a new project.

Maddie’s original idea for her TRAVEL theme was a westward moving scene with various vignettes showing the key concepts of her narrative. I outlined what was possible and what I thought we could accomplish in a small week-long project.

Construction

I had wanted to build an animation showing a parallax effect, where foreground objects move faster than background objects as the camera moves. I threw together a quick demo of a mountain scene and was showing it off when my older daughter Hailey reminded me,

“Dad, this is supposed to be Maddie’s project. You need to let her do it.”

She was right, of course, so we reset and went back to (Maddie’s) drawing board and started from there.

Over the course of the project Maddie created the storyboard and design sketches, made all the timing and action decisions, recorded sound effects and, of course, recorded the voice-over.

It was very dry in the Great Plains

Dad did do the animation, I will admit it. Plus I helped on some of the sounds, like the lower pitch of the thunder and the male voices.

We wanted to make the mob of Exodusters sound like a real thundering crowd of people so Maddie and I recorded three takes of ourselves murmuring and beating on the bed in the guest room. Once we layered these recordings over one another, we had our rampaging mob.

It was a great school presentation and everyone loved it, though the teacher did suspect that Dad may have helped a bit.

Reconstruction

Free land grants encouraged many to move west

After the school presentation, there were still a few things that Maddie wanted to add to the animation that we weren’t able to fit in before the original deadline. Over the next month or two, she and I spent an hour here and there adding her final polish to make the version you’re seeing here.

These enhancements included the music, the changing sky and ground colors, the setting sun and the popping-up of the houses at the end. We both hope you enjoy it!

3D Printing A Spaceship

The Artemis transitions from virtual to physical

Today at work, I printed a spaceship!

One of the best things about my job is that I’m surrounded by tremendously smart engineers who fully appreciate the benefits of geeking out. Over the history of our small start-up, my colleagues have brought in all manner of their creations, including home-built telepresence robots, processor-controlled Christmas decorations and a homemade quad-copter drone that was so good that it was able to crash into a tree way over on the far side of the parking lot.

But this week, we were the kings of Nerdville when Steve brought his new 3D printer in for all of us to play with.

The Ultimaker 2 3D Printer

The Ultimaker 2 is a fine-looking machine, considered to be the best consumer-level 3D printer available today. 3D printers work by squirting out a stream of hot plastic from a 2-axis print head to make a thin layer of material, much like a standard printer. Above you can see the blue plastic wire material that is fed to the print head from a spool on the back of the printer.

The 3D part happens once the first layer of plastic is laid down and the print head starts printing the next layer on top of it. Over the course of a print, the floor of the printer moves slowly downward and the many thin layers of hot printed plastic build up to form a 3D object.

The model I used for the print was the spaceship design we had created for Sector 42, the sci-fi movie Ken and I made with our daughters.

The virtual Artemis exists only in computer bits…

Up until now, the ship had only ever existed inside a computer but today, we would turn our creation into a physical object we could hold in our hands.

Preparing to Print

In order to print the Artemis, the computer model had to leave the glamorous world of motion pictures and get ready for the real world.

After I’d stripped the virtual model of paint, lights, engines and attached cameras, I closed the various holes in the hull where the engines sat and where the bridge window poked through the front. Once completed, I had a “watertight” manifold surface, as required by the printing software.

The computer model then moves from my original creation and photography software into the slicing software that prepares the model for 3D manufacturing. This program cuts the model into tiny slices and maps the print-head path over them. It also estimates the distributed weight of the material and creates an internal honeycomb of structural support inside the outer “shell”, as well as any external support structures that may be needed.

We decided that the Artemis should be oriented head-down to better distribute its mass, which required the ship to balance on its nose while being printed.

Slice 299 of 1150

In the above image from the printing prep software, you can see a single slice of the print plan showing the internal support honeycomb in yellow, the path of the print head in dark blue and the tiny aqua-colored external support structure at the bottom that kept the whole thing from tipping over during printing.

Printing

Once the software had generated the plan, it estimated that the final print would take two hours and thirty-six minutes. When completed, it would be 12 cm long and 14 grams in mass, and my children would be delighted.

In the 2-minute video above, you’ll see a time-lapse of the two prints we made of the Artemis.

 

Harmony-Explorer

Click to explore your own harmonies!

Why is it that some chord progressions sound much better to our ears than others? Why does every kind of music, from Beethoven to Jimi Hendrix, employ familiar chord patterns? Harmony-Explorer is a JavaScript application I built that helped the kids and me understand some of the secret underpinnings of music!

We came 916福彩下载 one evening after attending Mason’s high-school choral concert and were talking about music. Mason has been studying the guitar (and the piano) and had created some simple songs of his own. We got to talking about how certain chords seemed to fit together better than other chords and how it was that these same chord patterns seemed to show up in song after song.

Many simple guitar songs are played with some combination of (C, F, G), (G, C, D) or (D, G, A) chords. When a song established one of these patterns, it would almost never include one of the “foreign” chords from the other sets and would stick with its chosen group of three. Also, it seemed like the B and E chords almost never got “picked.”

Many simple guitar songs use the same chords

Each song chose a set of three chords, and that was it. This suggested that it was the relationships of the chords that mattered and not the chords themselves. But what was it about the chords that were chosen that made for better music?

Harmonies

The answer is harmonies, specifically certain harmonies. A harmony here is a chord: a set of three or more notes, defined by the lowest note of the set. When you choose the first, third and fifth notes of the musical scale defined by that lowest note, you have the most common (I) harmony.

We’ll simplify this by looking at an example in the key of C major. C major makes this easy because the white keys represent the notes in the scale and the black keys are the notes not in the scale. If we start with C as the first note and then choose the third and fifth keys from there, we have a very pleasant-sounding C major chord:

Notes 1, 3 & 5 in the C major scale

You may have heard the terms third and the fifth in regard to musical intervals and that’s what these two additional notes are, so named because they are the third and fifth notes of the scale. This 1, 3, 5 pattern works if you start with any key, but it’s not so obvious in other keys because their scales include the black keys, making it hard to see the relationships. It also works for minor keys so long as you pick the first, third and fifth notes of that minor scale:

Notes 1, 3 & 5 in the D major & D minor scale

So we know how to make the basic I harmony for a given key, but we still need to understand how to find chords that sound best when used in combinations. This, finally, is where we get to harmonies.

Harmonies, really this time…

Creating harmonies is simple. If you have your fingers on the three notes of the basic I harmony, move all your fingers up one note on the same scale and you have the ii harmony. Do it again and you get harmony iii and on up to vii.  As before, this works the same way in all major and minor keys though a C major example makes it easiest to see:

The first three harmonies of C major
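
If you like thinking in code, the whole pattern fits in a few lines: take scale degrees 1, 3 and 5, then slide the shape up one degree at a time, wrapping around the scale. This little C++ sketch (the app itself is JavaScript) prints the seven harmonies of C major.

```cpp
// Build each harmony by taking notes 1, 3 and 5 of the scale, then shifting
// the whole shape up one scale degree at a time.
#include <iostream>
#include <string>

const std::string C_MAJOR[7] = {"C", "D", "E", "F", "G", "A", "B"};
const std::string ROMAN[7]   = {"I", "ii", "iii", "IV", "V", "vi", "vii"};

int main() {
  for (int degree = 0; degree < 7; degree++) {
    // The triad is scale degrees (degree), (degree+2), (degree+4),
    // wrapping back to the start of the scale as needed.
    std::cout << ROMAN[degree] << ":\t"
              << C_MAJOR[degree] << " "
              << C_MAJOR[(degree + 2) % 7] << " "
              << C_MAJOR[(degree + 4) % 7] << "\n";
  }
  return 0;
}
// Output: I: C E G,  ii: D F A,  iii: E G B,  IV: F A C,
//         V: G B D,  vi: A C E,  vii: B D F
```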

This next part you just have to accept, kind of like a law of the universe: harmonies I, V and IV sound awesome together in pretty much any combination, no matter what key or major/minor mode you choose.

Explaining why would take a much longer blog post and has to do with frequencies and math, so instead we can simply accept this law and use our new insight to re-draw our original guitar-chords picture.

Those guitar chords were chosen because they’re the primary I, IV & V harmonies

Knowing what we now know about harmonies gives us new insight into how those guitar chords were chosen. In each of our song examples, the chords chosen were the I, IV and V harmonies and this shows us that both Ludwig and Jimi were up to the same thing.

Harmony-Explorer

Harmony-Explorer is a web application that lets you explore these concepts yourself and see the notes displayed on a keyboard. You can also create your own harmony progressions and record your compositions for playback. Click on the colorful diagram and take the “suggested” routes I drew — or follow your own path!

While the I, IV and V harmonies are the strongest and most often used, the other 4 main harmonies (ii, iii, vi, vii) certainly also have a place in music. There are also many secondary harmonies that borrow notes from other keys. When used in various progression patterns, these secondary harmonies can add or release tension and evoke emotion that the simpler primary chords cannot.

Some general guidance on exploring harmonies can be found under the ⓘ icon in the upper-right corner of the application. You can load some sample harmony progressions from popular songs by clicking the Samples button.

Click to explore your own harmonies!

Application Development

Harmony-Explorer is written in JavaScript and CSS3 and incorporates a third-party library for sound playback; another library supports the icon images. All modern browsers should be supported, including mobile. I’ve open-sourced Harmony-Explorer on GitHub, so if you’re a developer, you can get the code there.

Motion Graphics – thejbo goes to Florida

My friend at work, who goes by “thejbo”, is the guy who builds and maintains all the servers that host our company web application. He’s moving to Florida to work remotely and I made this motion graphic to commemorate his leaving the office. Really though, it was an excuse to learn more about After Effects and to work with my daughter Hailey. We had a great time making it!

Several people have asked me what went into creating the composition so I’ll include a quick overview of the various parts here in this post.

Overview and Script

I wrote the script in a quick single draft between two meetings at work. (I’ve learned that when the muse visits you, you should take notes right then and there.) That original script really didn’t change much over the course of the eight weeks of evenings it took to complete the project.

After Effects

The assembly and animation were done in Adobe After Effects, but the project relied heavily on three additional creative tools, which I’ll outline below.

Below you can see the 33 layers that made up the After Effects composition.

thejbo’s After Effects composition

The animation is based on a number of techniques I’d recently studied and wanted to try out.

— This basic technique of moving the camera around to expose typography opens the animation, but I moved on pretty quickly. I followed a great tutorial of text-based motion graphics, or “kinetic type,” techniques.

— This is a great way to understand After Effects 3D layers, parenting transforms and layer masks. Mt. Mograph presents this technique and is a great source for all kinds of project ideas and utility scripts.

— The animated “hipster badge” at the end of the video is another technique I got from Mt. Mograph and also used in my recent Hipster Swag Badge motion graphic.

The rest was basically just coming up with animations to match the script.  I had built the simple graphics for the “server stacks” sequence from scratch, but soon realized that I was going to need a lot of very specific graphics for the remaining scenes.

Illustrator

Adobe Illustrator was used to create the 18 vector-art assets used in the full animation. Because I have neither the talent to create nor the money to buy professional graphics assets, I simply found watermarked (sample) clip art that I liked and used it as a guide in recreating a simplified vector version in Illustrator. This is probably a “gray area” sort of activity, but I figured hey, I wasn’t going to be monetizing it.

Stock art on the left, my version on the right

Photoshop

Two of the graphics, the guard kittens and the figure of thejbo, were created in Photoshop because they were originally photographs. I created the kitten guards from some Google images, and thejbo let me take a picture of him that became his waving figure.

Once the image assets were masked, I used a technique I picked up from a tutorial to create the two-color posterized effect that incorporated the palette colors and helped the photos look more like the surrounding graphics.

Original photos on the left, final stylized graphics on the right

Color

Determining the color palettes for each scene and how they fit into the overall composition was perhaps the most fun part of the project. I got to work with my daughter Hailey, who provided a lot of input into the final color sets.

Illustrator has a great tool called Recolor Artwork that will show you all the colors used in a composition and where they sit on the color wheel. Using this, you can play around with color harmonies and adjust color values directly in the artwork simply by moving color handles on the wheel. Pretty awesome, actually.

In the images below, you see the master sheet that we used to define all the color palettes in the animation.

Each of the main scene elements and their palettes

The opening scene uses a very narrow color harmony. In the map scene that comes next, the original palette is replaced by a pentagonal harmony that spans the entire color space, to give a feeling of change and expansion.

The palette of the last scene still uses a broad color harmony, but it’s limited to half the color wheel and is a bit darker overall. At the very end, the color elements are removed, leaving the original starting palette.

Audio

The audio was produced in Adobe Audition and incorporated 9 tracks and 30 individual sound elements including drums, bells, whooshes, music and sound samples, in addition to the voice-over track.

Audio design layout

I recorded the voice-over in our spare bedroom with a Zoom H2n audio recorder and then edited together the best of the various takes using Audition. From there, I did some equalization, normalization and compression on the track before adding a 10% pitch reduction and some light reverb.

The sound effects and music were a mix of Video Copilot samples as well as sounds and music from license-free sources.

Using the video from After Effects as a guide, I added and mixed these sound elements to create the final composition.

Hipster Swag Badge

This is a quick motion graphic I created to give hipster props to Kalimar, one of our developers.

The original graphic was created in Illustrator and then brought into After Effects for the animation.

You can learn these same techniques I used by watching these two great tutorials:

  • Mt. Mograph —
  • ECAbrams —