Posts

Showing posts from 2017

Machine Learning -- Neural Network

There's an ever-increasing amount of excitement in the industry surrounding Neural Networks, which seems to be creating a race to figure out new ways of using them. This is great; I love AI, and Neural Networks are not new, the underlying concepts date back decades. However, I'm starting to notice a disturbing trend: Neural Networks are being treated as a magical solution, and the excitement is creating hordes of documentation that fails to describe what a Neural Network is, seeking only to show how to use one for a specific problem. So I'm accelerating my personal schedule and writing this quick, high-level intro to try to fill the gaps I'm noticing in the documents and tutorials out there. I'll keep this really high level and talk about the concepts, and the intent behind each step of the process, so that everyone who reads this can shed a bit of the mystery surrounding Neural Networks. As everything having to do…
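To make the basic building block concrete before the post digs in: an artificial neuron is just a weighted sum of its inputs pushed through a non-linear activation, and a network is layers of these wired together. Here's a minimal sketch in C++; the weights and inputs are arbitrary values for illustration only:

    #include <cmath>
    #include <cstdio>
    #include <vector>

    // One artificial neuron: activation(dot(weights, inputs) + bias).
    double neuron(const std::vector<double>& w, const std::vector<double>& x, double bias) {
        double sum = bias;
        for (size_t i = 0; i < w.size(); ++i)
            sum += w[i] * x[i];
        return 1.0 / (1.0 + std::exp(-sum));  // sigmoid squashes the output into (0, 1)
    }

    int main() {
        std::vector<double> weights = { 0.5, -0.6, 0.1 };  // arbitrary example values
        std::vector<double> inputs  = { 1.0, 2.0, 3.0 };
        std::printf("neuron output: %f\n", neuron(weights, inputs, 0.25));
    }

Everything a network "learns" lives in those weight and bias values; training is just the process of adjusting them.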

Status of the blog

I have some news: Qualcomm is no longer running a developer advocate program. However, I've thoroughly enjoyed activities such as giving lectures, attending meetups, working with other devs on their projects, and answering questions in general. I've also really enjoyed working on this blog. So for the foreseeable future, I will continue this blog and the ongoing projects. However, as I am no longer a developer advocate for Qualcomm, my posts will not seek to specifically highlight the Qualcomm ecosystem; if there is a better solution, I will highlight that instead. I intend to make this blog a resource for developers looking to work in embedded programming, and I will also highlight AI and ML. Over time, I might start to introduce some news and thoughts on what new technologies mean by comparing them with their competition. If this blog ceases to be fun for me, or if it never proves popular enough to keep going, I'll end it. For now, however, I have…

Drone Controller Software -- Part 1

Looking back at the last post, I detailed four main modules. The largest of these is the Drone Software module: the executable that actually runs on the drone itself. This post will detail the architecture of the drone executable module, along with how to get started writing it. First, let's talk about how to get the Navigator libraries to work in Android Studio. Remember that we're linking in a dependent pre-built library. I suggest using CMake, as it has strong integration with Android Studio and is well supported. To use CMake, create the project as detailed here. Now, edit the CMakeLists.txt file and add the library dependency like so:

    include_directories( ${CMAKE_CURRENT_SOURCE_DIR}/snav/1.2.31_8/include )
    add_executable( drone-controller ${SourceList} )
    add_library( libsnav-arm SHARED IMPORTED )
    # NOTE: the IMPORTED_LOCATION path is assumed to mirror the include
    # directory above; adjust it to wherever libsnav-arm.so actually lives
    set_target_properties( libsnav-arm PROPERTIES IMPORTED_LOCATION
        ${CMAKE_CURRENT_SOURCE_DIR}/snav/1.2.31_8/lib/libsnav-arm.so )
    # link the imported library into the executable target
    target_link_libraries( drone-controller libsnav-arm )

Drone VR Software Architecture

Finally, we're at the point in our opus VR drone project where we need to look at the software. If you're following along, I have the hardware drone built, using Snapdragon Flight. To review: we know how to create a project in Android Studio that can target Snapdragon Flight for deployment; we know how to utilize the Daydream controller to build an arm model; we know how to set up a VR project in Android Studio; and we know how to communicate with the Snapdragon Flight controller. So, if we put it all together, we only need to figure out how to program the controller aspect of the Flight. Thankfully, there's rich support available for Flight that allows us full C-API control over both the ESC and the controller board, provided through the Navigator API library. Now let's talk about the overall design. I envision four main modules (a sketch of the kind of message that flows between them follows below):
- VR Controller App: interfaces with the user, gets positional updates and issues commands de…
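To make the module boundary concrete, the traffic from the VR Controller App to the drone executable can be as simple as a fixed-layout struct sent over the network. This layout is purely hypothetical, for illustration; it is not taken from the Navigator API:

    #include <cstdint>

    // Hypothetical command packet from the VR Controller App to the drone
    // executable; every field here is illustrative, not from any real API.
    struct DroneCommand {
        uint32_t sequence;  // detect dropped or reordered packets
        float    thrust;    // 0.0 .. 1.0
        float    yawRate;   // radians per second
        float    pitch;     // radians
        float    roll;      // radians
    };

Keeping the wire format this dumb makes it easy to unit test each module on its own.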

Machine Learning Part 1 -- Linear Regression

In working with the drone project, I ran into an issue: it turns out to be really hard to control the drone while transitioning to and from VR. That is to say, once the drone is in the air, flying it in VR is really comfortable and easy; getting it into the air and then transitioning to VR mode, however, is a challenge. There's also the question of what happens when VR mode lets the drone do something that would cause it to crash. There are two potential solutions to this problem: 1) have a buddy control the drone and keep it in stable flight, with a manual switch-over when ready for VR control, or 2) write some AI to allow the drone to control itself. I'm opting for #2 for a few reasons. First, I'd rather be able to fly the drone whenever I wish, whether a buddy is available or not. Second, I'd like to avoid creating multiple control schemes that then demand a certain skill level to fly my drone. Finally, I'm a programmer…
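For a flavor of what linear regression does: it fits a line y = m*x + b to data by minimizing squared error, and with one input variable the best fit has a closed form. Here's a minimal sketch in C++, with made-up sample data:

    #include <cstdio>
    #include <vector>

    struct Fit { double m, b; };

    // Ordinary least squares for y = m*x + b (closed form, one variable).
    Fit linearRegression(const std::vector<double>& x, const std::vector<double>& y) {
        const double n = static_cast<double>(x.size());
        double sx = 0, sy = 0, sxx = 0, sxy = 0;
        for (size_t i = 0; i < x.size(); ++i) {
            sx += x[i]; sy += y[i];
            sxx += x[i] * x[i]; sxy += x[i] * y[i];
        }
        const double m = (n * sxy - sx * sy) / (n * sxx - sx * sx);
        const double b = (sy - m * sx) / n;
        return { m, b };
    }

    int main() {
        std::vector<double> x = { 0, 1, 2, 3, 4 };
        std::vector<double> y = { 1.1, 2.9, 5.2, 7.1, 8.8 };  // roughly y = 2x + 1
        Fit f = linearRegression(x, y);
        std::printf("m = %f, b = %f\n", f.m, f.b);  // expect m near 2, b near 1
    }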

Soldering new connectors for the drone

If you're with me so far in our Snapdragon Flight VR build, you might have noticed that the connectors that come with the motors do not match the connectors that come with the Snapdragon ESC. This means we need to replace the connectors so electrons can flow and power the motors. Keeping in mind that the whole point is getting electrons to flow makes the concept easier. Copper is a conductive material; that's what the wires are made of. Solder is a highly conductive metal with a low melting point that solidifies quickly. So the idea is to use molten solder to join the wires to the new connector. I personally favor damaging a cheap motor over damaging an expensive ESC, so I opt for doing the work on the motor connections themselves and highly recommend this path. While the minimum you'll need is 4 connectors, I am really good at making mistakes, so I wound up getting 10. The first step is to cut the old connector off. Grab a pair of w…

Build Snapdragon Flight Drone

In order to have a VR drone, we must start by building a real, physical drone. Snapdragon Flight offers a great lightweight, powerful platform we can do a lot with. While most home-built drones are built to very custom specs, I felt it necessary to make this very approachable by ensuring the drone is built exactly as the specs outline. Ours looks slightly different, but hey, drone time! Let's start with the parts the drone needs:
- Allen wrench hex key set
- Dynamite 5 pc metric nut driver assortment
- Generic M2.5 nylon hex M-F spacer/screw/nut assorted kit
- Loctite Heavy Duty Threadlocker
- 2x clockwise motors
- 2x counter-clockwise motors
- AC converter for battery charger
- Battery charger
- Battery
- ESC (to control the 4 props)
- Frame kit
- GPS antenna
- 4x DF3-3P-2DSA connectors
- Wifi antenna
- Grey props: x2
- Red props: x2
Before jumping straight into working with the hardware, a small housekeeping task must be performed. There's a small problem: if you have big…

Connect to a USB Serial Connection

In order to connect over a USB serial connection, as is the case when connecting to Snapdragon Flight, the steps differ slightly for each host operating system. I broke this out into a separate blog post for easy reference from any other project. Note that these instructions work equally well on Intel's Edison boards, which also require USB serial connections. If you're trying to connect from Windows, here are the steps:
1. Run PuTTY.
2. Under Connection type, click Serial.
3. In the Serial line field, enter the COM# for your board, such as COM12. NB: if you did not identify your COM# when setting up your board, open the Device Manager and look for an entry called USB Serial Port (not Intel Edison Virtual Com Port); the COM# is displayed next to the USB Serial Port entry, as highlighted below.
4. In the Speed field, enter the board's baud rate: 115200.
Your configuration screen should look like this: …
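If you're connecting from Linux or macOS instead, a terminal program such as screen gives you the same session: assuming your board enumerates as a device node like /dev/ttyUSB0 (the exact name varies by OS and serial adapter), run screen /dev/ttyUSB0 115200, using the same baud rate as above.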

Build custom arm model for VR

It's time for some fun: an awesome use case for quaternions that achieves portable results. If you aren't familiar with quaternions, please feel free to review them in my previous article. If that's confusing, no problem; you can use the code below, but it might be a bit tough to understand without the prerequisite math knowledge. So let's look at the inspiration: notice that controller? In the world of mobile VR, the two main systems, Google's Daydream and Oculus' mobile Gear VR, are both now shipping with a controller that is used much like a laser pointer in the virtual world. While both Google and Oculus provide an SDK for working with the controller, much of what it takes to work with it is missing from the documentation. Thus most apps rely upon other people's controller implementations and take on a project-wide dependency (Unity, Unreal) just for one very small aspect of the application. Also, my project…

Quaternions

As I'm starting to put together the drone VR project, I've noticed that I've been remiss in explaining a facet of programming we're going to be depending upon, one that, while well understood among mathematicians, is viewed as too complex by most programmers. One of the many tasks we often face in VR or game development is the need to know orientation: how to rotate objects so they appear oriented the way we have planned. Now, the easiest way to handle rotations is to use Euler angles. That is to say, imagine a 3D Cartesian coordinate system centered at the center of the 3D object we want to rotate; the straightforward approach is to rotate it around each axis a certain number of degrees, in 3 separate steps. This is exactly how a gimbal works. Euler angles have the property of being extremely easy to understand while also being susceptible to a very problematic detail: if one axis rotates too close to another, then both dimensions r…
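To ground that in code: the fix this post builds toward is quaternions, which represent a rotation as four numbers and compose without gimbal lock. Here's a minimal, self-contained sketch; the type and function names are just for illustration, not from any SDK:

    #include <cmath>
    #include <cstdio>

    // Minimal quaternion: w + xi + yj + zk.
    struct Quat { double w, x, y, z; };

    // Build a unit quaternion from a normalized rotation axis and an angle.
    Quat fromAxisAngle(double ax, double ay, double az, double radians) {
        double h = radians * 0.5;
        double s = std::sin(h);
        return { std::cos(h), ax * s, ay * s, az * s };
    }

    // Hamilton product: composes two rotations (apply b, then a).
    Quat mul(const Quat& a, const Quat& b) {
        return {
            a.w*b.w - a.x*b.x - a.y*b.y - a.z*b.z,
            a.w*b.x + a.x*b.w + a.y*b.z - a.z*b.y,
            a.w*b.y - a.x*b.z + a.y*b.w + a.z*b.x,
            a.w*b.z + a.x*b.y - a.y*b.x + a.z*b.w
        };
    }

    // Rotate vector v by unit quaternion q: v' = q * v * q^-1.
    void rotate(const Quat& q, double v[3]) {
        Quat p  = { 0, v[0], v[1], v[2] };
        Quat qc = { q.w, -q.x, -q.y, -q.z };  // conjugate == inverse for a unit q
        Quat r  = mul(mul(q, p), qc);
        v[0] = r.x; v[1] = r.y; v[2] = r.z;
    }

    int main() {
        // Rotate the x axis 90 degrees around z; expect roughly (0, 1, 0).
        Quat q = fromAxisAngle(0, 0, 1, M_PI / 2);
        double v[3] = { 1, 0, 0 };
        rotate(q, v);
        std::printf("%f %f %f\n", v[0], v[1], v[2]);
    }

Composing two rotations is just one more mul call, which is exactly why quaternions behave so well for chained joints like an arm model.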

Build executables for any ARM device

Android Studio has given developers a new IDE to create apps that work on Android. Now, with the recent addition of CMake integration, it has also given us embedded developers something a bit more: an IDE that can target not only Android but any device that runs Linux on a supported ABI. Shock, horror? No, it's good to react with eagerness and let our inner hacker out. When building apps for the Raspberry Pi, Snapdragon's family of boards, or any embedded device running Linux, instead of creating a cross compiler, we can use the one that Google provides. In this blog post, I'm going to dedicate us to the lofty goal of a traditional Hello World compiled from Android Studio and run on a Snapdragon 410c. It's simple really, so let's get started. First, create a module or new project, then get Android Studio to use CMakeLists.txt. From here, we need to start doing things that aren't part of the documentation. We need to create an executable. Howeve…
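For reference, the Hello World program itself is the ordinary one. A minimal main.cpp (the file name is whatever you choose) might look like this, and on the CMake side the point is to declare it with add_executable rather than the add_library call the templates give you:

    #include <cstdio>

    int main() {
        std::printf("Hello World from an ARM board!\n");
        return 0;
    }

One simple way to test the result is to adb push the built binary to /data/local/tmp on the device and run it from adb shell.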

Android NDK with Google Test Part 2

I recently gave a presentation on using Google Test within Android Studio; here are the slides I used. I thought it would be easier to see the code this way. The awesome thing about this is that it's highly useful for the upcoming Snapdragon Flight drone controlled by Daydream running on a Snapdragon 821 powered Pixel XL. Being able to automatically determine whether the various parts are working is essential for any complex program.
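If you haven't seen Google Test before, a test case is just a function declared through the TEST macro. A minimal sketch (the add function under test is a stand-in for your real code):

    #include <gtest/gtest.h>

    // Stand-in for the real code under test.
    int add(int a, int b) { return a + b; }

    TEST(MathTest, AddsTwoNumbers) {
        EXPECT_EQ(4, add(2, 2));  // non-fatal: the test keeps running on failure
        ASSERT_NE(0, add(1, 1));  // fatal: the test aborts here on failure
    }

Google Test supplies its own main via gtest_main, so linking that in is all a test executable needs.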

Unboxing the Snapdragon Flight

I got my hands on the Snapdragon Flight development kit. I thought I'd start with an unboxing video showing what the kit comes with. This drone project will tie into the other work I've been doing via a VR app that controls it. I'm going to record a few videos along the way to give better access to what I'm doing and how, and I'll also open source the entire project.

Profiling a NDK app from a CI server

As we get into working in VR, one of the chief toolsets missing from our developer toolbox is profiling. The problem is that VR has a hard CPU/GPU requirement which is not forgiving of unoptimized code. So every time a team member checks in code, we really must ensure that the app still performs at a minimum of 60 fps, or our end users will wind up sick. As I've mentioned before, I prefer Jenkins as a CI server. Now, something we can do is start recording a systrace and run a monkey test behind it, so the app is doing more than just sitting there. To do that, create a new shell task in Jenkins and do something like this:

    adb shell monkey -p your.package.name -v 5000 > monkeyResults.txt

This will run 5000 random events, back to back, of the kind an end user might trigger. The nice thing about it is that it generates random user input and runs any app in an unpredictable yet repeatable manner. It looks for crashes from unexpected user interaction, and we can use Jenkins to parse the…
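One detail worth knowing when wiring up that parsing step: when a monkey run hits a crash or an ANR, its output includes marker lines beginning with "// CRASH:" or "// NOT RESPONDING:", so even a simple text scan over monkeyResults.txt is enough for Jenkins to fail the build.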