Thursday, September 27, 2018

Sensor Placement in Photography and Racing Drones



Sensor placement is one of the most crucial elements of unmanned system design, and
the purpose of the unmanned system dictates exactly where and how critical sensors are 
placed. Quadcopter drones (the four-rotor helicopter drones that have become 
ubiquitous in electronics stores) all look similar from a distance, but vary wildly 
in their sensor placement depending on their purpose. Two such unmanned systems 
are photography drones and FPV racing drones, which superficially look similar 
but are actually worlds apart in both purpose and sensor placement.
           
DJI is one of the industry leaders in hobby and commercial photography drones, 
and their Phantom 4 drone stands at the top of their professional-level UAV 
catalog (Mellors, n.d.). According to Mellors (n.d.), the chief feature of the 
Phantom 4 is the 4K camera, which enables the user to record ultra-high 
resolution video. Below is an image of the drone:

This is a great example of the drone’s camera placement. The camera is mounted on a 
gimbal on the bottom of the craft, which lets it both rotate and pan up and down. 
This allows the camera to adjust to a wide variety of positions while the drone 
hovers in place, giving the pilot the ability to capture a wide range of images 
and angles without having to reposition the drone. The gimbal also compensates 
for the UAV’s movement in the air, enabling the camera to record stable footage 
even while the craft is moving (Mellors, n.d.). Sensor placement is key to the 
Phantom 4’s functionality as a photography tool.
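The stabilization idea behind the gimbal can be sketched in a few lines of code. This is a hypothetical, simplified model (a single pitch axis and made-up travel limits, not DJI's actual control loop): the gimbal commands a camera angle that cancels the airframe's tilt, so the lens keeps pointing where the pilot aimed it.

```python
# Hypothetical sketch of single-axis gimbal stabilization; the angle
# limits below are illustrative, not the Phantom 4's real travel range.
def gimbal_pitch_command(target_pitch_deg, drone_pitch_deg,
                         min_deg=-90.0, max_deg=30.0):
    """Return the gimbal angle that holds target_pitch_deg in the
    world frame, clamped to the gimbal's travel limits."""
    command = target_pitch_deg - drone_pitch_deg
    return max(min_deg, min(max_deg, command))

# The drone pitches 10 degrees nose-down while filming the horizon (0 deg):
print(gimbal_pitch_command(0.0, -10.0))  # gimbal tilts +10 to compensate
```

A real gimbal runs this correction continuously from its own IMU, but the principle is the same: subtract the craft's motion from the camera's aim.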
            
Camera drones aren’t the only quadcopters in the skies; first-person view (FPV)
racing drones are also becoming popular. FPV drone racing is a growing sport 
where competitors fly their drones through pre-built tracks (Drone Enthusiast, 2016). 
The “FPV” is the key term here; operators utilize displays which show a first-person 
view from the drone, almost as if the pilot is sitting in the drone’s cockpit. The 
FPV requirement dictates very specific mounting for the racing UAV’s camera, 
namely its nose:

This is the Walkera Rodeo 150 FPV racing drone. Just like the DJI Phantom 4, 
it is a quadcopter, but that is where the similarities end. The Rodeo is a lightweight 
craft; it only weighs a hundred and fifty grams, and is designed for indoor and 
outdoor racing (Brown, n.d.). According to Brown (n.d.), it is also designed for 
aerobatic flight.
            
Since this is an FPV racing drone, the camera is mounted to the UAV’s nose,
centered in such a way as to simulate a cockpit view. According to Brown (n.d.), 
the camera’s position is fixed in place, with the optics allowing for a wide-angle 
view of one hundred and ten degrees. As Brown (n.d.) points out, this is a suboptimal 
position for photography, which removes the drone from consideration for serious 
photographic work. However, in its intended use as an FPV viewfinder, the drone
works very well, with the UAV’s electronics transmitting the camera’s
data to a viewing device with almost no lag.
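To get a feel for what a one-hundred-and-ten-degree field of view means in practice, a quick back-of-the-envelope calculation helps. This sketch assumes a simple pinhole-camera model (not the Rodeo 150's actual lens geometry) and estimates how wide a slice of track the pilot sees at a given distance.

```python
import math

# Illustrative pinhole-model estimate of scene width for a fixed
# wide-angle FPV camera; real lenses distort toward the edges.
def visible_width(distance_m, fov_deg=110.0):
    """Width of the scene captured at distance_m for a given
    horizontal field of view."""
    return 2.0 * distance_m * math.tan(math.radians(fov_deg / 2.0))

# A racing gate 10 m ahead: the pilot sees roughly a 28.6 m wide slice.
print(round(visible_width(10.0), 1))
```

That width is why a fixed nose-mounted camera works for racing: at speed, the wide-angle view covers the whole track ahead without the camera ever needing to move.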
            
Sensor placement is a critical element of UAV design, and the drone’s purpose often
decides where key sensors are placed. The most important sensor on both a racing 
drone and a photography UAV is the camera, although camera placement varies wildly 
from one platform to another. The DJI Phantom 4 carries its camera on its belly in 
order to allow for maximum photographic flexibility; the Walkera Rodeo 150 carries 
its camera on its nose, in order to give its pilot the best possible view when careening 
through a racetrack at high speeds. Both drones work well in their intended roles, and 
sensor placement is key to their successful function.




                                                               References

Brown, J. (n.d.). Walkera Rodeo 150: a compact FPV quadcopter for racing. Retrieved from

     http://mydronelab.com/reviews/walkera-rodeo-150.html on 27 September 2018.

Drone Enthusiast (2016). FPV drone racing – the UAV sport about to hit the big time.
     
     Retrieved from https://www.dronethusiast.com/fpv-drone-racing/ on 

     27 September 2018.

Mellors, J. (n.d.). DJI Phantom 4 review for photographers. Retrieved from





Wednesday, September 19, 2018

                                                       US Navy Bluefin-12(D)

In November of 2017, an Argentinian Navy submarine named the ARA San Juan
disappeared while traveling from an Argentinian naval base in Tierra del Fuego to its home port
in Mar del Plata (McKirdy, 2017). According to McKirdy (2017), the submarine, if underwater,
was in danger of running out of oxygen in as little as seven days. The vessel’s disappearance
sparked a frantic search from both the Argentine Navy and its international partners.  One of
those partners, the US Navy, deployed four unmanned underwater vehicles (UUVs) in support of
the search (Werner, 2017). According to Werner (2017), one of those vehicles was the Bluefin-
12(D).

The Bluefin-12(D), manufactured by General Dynamics, is a torpedo-shaped, highly
modular UUV capable of diving to a maximum depth of fifteen hundred meters (General
Dynamics, n.d.). According to General Dynamics (n.d.), its primary proprioceptive sensor is an
inertial measurement unit (IMU), which internally monitors the UUV’s speed and pitch in the
water. General Dynamics (n.d.) also states that one of the craft’s chief exteroceptive sensors is a
Doppler velocity log (DVL), which measures the Doppler shift of acoustic pulses reflected back
to the craft to determine its velocity through the water, and then uses that information to confirm
the IMU’s data. A GPS receiver and a compass round out the suite of navigation sensors, all of
which aid the Bluefin in navigating deep water.
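The relationship between the IMU and the DVL can be illustrated with a toy sensor-fusion sketch. This is not the Bluefin's actual navigation code (General Dynamics does not publish it); it is a minimal complementary filter showing the general idea of using one sensor's direct measurement to correct another's drift.

```python
# Illustrative complementary filter: blend the IMU's dead-reckoned
# speed with the DVL's direct measurement. The dvl_weight value is an
# assumption chosen for the example, not a published Bluefin parameter.
def fuse_speed(imu_speed, dvl_speed, dvl_weight=0.2):
    """Blend two speed estimates; dvl_weight sets how strongly the
    DVL measurement corrects the IMU estimate each update."""
    return (1.0 - dvl_weight) * imu_speed + dvl_weight * dvl_speed

# The IMU has drifted to 2.3 m/s while the DVL reads 2.0 m/s:
print(round(fuse_speed(2.3, 2.0), 2))  # estimate pulled toward the DVL
```

Run repeatedly, this kind of correction keeps the dead-reckoned estimate from drifting unboundedly between GPS fixes, which a submerged UUV cannot receive.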

The US Navy made an official statement that the Bluefin-12(D) was being utilized in the
search, but did not specify what search and rescue-specific modifications were made to the craft
(US Navy, 2017). However, a quick look at the Navy’s official photo of the Bluefin shows that it
is missing one modification that I think would greatly improve the craft’s ability to carry out its
mission: a manipulator arm.

                                      Bluefin-12(D). Source: US Navy Public Affairs Office.

I believe that a manipulator arm would greatly increase the Bluefin’s effectiveness, as it
would give the craft the ability to move debris and remove obstacles, thereby clearing a path for
manned rescue efforts. While not specified in the craft’s spec sheet, the Bluefin’s modularity
indicates that it can be mounted with a manipulator arm without too much difficulty.

Unmanned aerial vehicles (UAVs) can be combined with unmanned surface and
underwater vehicles to maximize search effectiveness in maritime environments. In 2017, a
research team from the Air Force Engineering University in China conducted a study that
determined that UAVs can be far more effective than manned aircraft in maritime search-and-
rescue operations (Lei, Jianbo, & Shukui, 2017). According to Lei et al. (2017), the ability of
UAVs to maintain sustained operations while providing both automatic and manual scanning
greatly increases the odds of rescuers finding lost personnel. By utilizing both a UAV and a
UUV, rescuers can systematically search large areas of water for longer periods of time (and at a
much lower cost) than by utilizing manned craft for the same purpose.

UUVs have many advantages over their manned counterparts when it comes to maritime
operations. They are generally far less expensive to operate than manned craft and present no
danger to pilots or crew while operating (McPhail, 2002). They are also small and compact in a
way that manned craft cannot be. The Bluefin-12(D) is small enough to be maneuverable
in tight spaces, a potentially crucial ability when attempting to navigate wreckage. A craft the
size of the Bluefin is far too small to accommodate a human crew.

The story of the Argentine submarine does not have a happy ending; the US Navy called
off its search for the vessel six weeks after it disappeared (Chaplain, 2017). As of May 2018,
the submarine is still missing, with little hope of survival for its forty-four person crew
(Goldman, 2018). However, while this operation may have failed thus far, the increasing use of
UUVs in maritime search and rescue operations may increase the number of lives saved in the
future.


                                                                 References

Chaplain, C. (2017). US Navy ends search for missing Argentine submarine ARA San Juan. Retrieved from
     https://www.standard.co.uk/news/world/us-navy-ends-search-for-missing-argentine-
     submarine-ara-san-juan-a3727766.html

General Dynamics (n.d.). Bluefin-12D autonomous underwater vehicle (AUV). Retrieved from
     https://gdmissionsystems.com/en/products/underwater-vehicles/bluefin-12-d-autonomous-
     underwater-vehicle

Goldman, J. (2018). 6 months after Argentine submarine went missing, families feel ‘invisible’. Retrieved from
     https://abcnews.go.com/International/months-argentine-submarine-missing-families-feel-
     invisible/story?id=55146472

Lei, Z., Jianbo, H., & Shukui, X. (2017). Marine search and rescue of UAV in long-distance
     security modeling simulation. Polish Maritime Research, 95(24), 192-199.

McKirdy, E. (2017). Argentina’s missing submarine: what we know. Retrieved from
     https://edition.cnn.com/2017/11/20/americas/argentina-submarine-what-we-know/index.html
     on 18 November, 2018.

McPhail, S. (2002). Autonomous underwater vehicles: are they the ideal sensor platforms for
     ocean margin science? Ocean Margin Systems, 79-97.

US Navy. (2017). US Navy deploys unmanned submersibles in Argentine submarine search.
     Retrieved from https://www.navy.mil/submit/display.asp?story_id=103420

Werner, B. (2017). US Navy undersea teams now underway as part of Argentine submarine
     search. Retrieved from https://news.usni.org/2017/11/22/u-s-navy-unmanned-underseateams-
     now-underway-part-argentine-submarine-search


Monday, September 10, 2018



Original article: A self-driving car in every driveway? Solid-state LIDAR is the key.

Author: Nick Mokey

URL: https://www.digitaltrends.com/cars/solid-state-lidar-for-self-driving-cars/

                                                                       Summary:

LIDAR is an exteroceptive sensor that is crucial in the operation of self-driving vehicles. It works on the same principle as radar, but utilizes light instead of radio waves. Here is an image of a conventional LIDAR system on an autonomous vehicle:


The module you see on top of these cars is the LIDAR system. It spins rapidly while emitting beams of light. As each beam bounces back, the LIDAR uses the return time and angle of the reflection to calculate that point's position, building a three-dimensional image of the car's surroundings. The car then uses this data to plot a path through its environment. LIDAR is a key exteroceptive sensor on many self-driving cars.
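The geometry behind this is straightforward and can be sketched in a few lines. This is an illustrative simplification (real units also correct for mirror geometry and vehicle motion): the time of flight gives the range, and the beam's azimuth and elevation angles place that range in 3-D space.

```python
import math

# Minimal sketch of how one LIDAR return becomes a 3-D point.
SPEED_OF_LIGHT = 299_792_458.0  # m/s

def lidar_point(time_of_flight_s, azimuth_deg, elevation_deg):
    """Convert one echo into an (x, y, z) point in metres."""
    rng = SPEED_OF_LIGHT * time_of_flight_s / 2.0  # light travels out and back
    az, el = math.radians(azimuth_deg), math.radians(elevation_deg)
    x = rng * math.cos(el) * math.cos(az)
    y = rng * math.cos(el) * math.sin(az)
    z = rng * math.sin(el)
    return (x, y, z)

# An echo arriving after ~66.7 ns puts the target about 10 m straight ahead.
x, y, z = lidar_point(66.7e-9, azimuth_deg=0.0, elevation_deg=0.0)
print(round(x, 2))
```

A spinning unit repeats this conversion hundreds of thousands of times per second across its sweep, which is what produces the dense point cloud the car navigates by.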

It is also unreliable and expensive. A conventional LIDAR system requires many moving parts, all built around an assembly that must spin very rapidly in order to work properly. Any system with multiple moving parts and a high rotational speed is prone to many different types of failure. And at upwards of $75,000, a single system costs more than many of the cars to which it is mated. LIDAR is critical technology, but is unrealistic as a mass-market solution in its current form.

Enter solid-state LIDAR. This is what a solid-state module looks like:


This solid-state system, developed by Velodyne, is one of many modules currently in development. It is a fraction of the size and a fraction of the cost of a conventional spinning LIDAR.

So how does it work? Simply put, "solid state" means no moving parts, and a solid-state LIDAR works by utilizing an array of light emitters to scan a focused slice of the surrounding environment. According to Quanergy, a manufacturer quoted in the article, this drives the price point of a single solid-state LIDAR system to under $1,000, with technology improvements lowering the price further still. A solid-state LIDAR system can only see a limited slice of its environment (something like 90-120 degrees, depending on the unit), necessitating the use of multiple systems for full 360-degree coverage. However, given the unit's size and price point, it is still far more practical and cost effective to install three or more solid-state units on a car than to use just one conventional LIDAR system.
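The coverage arithmetic above is easy to verify. This quick sketch (ignoring the beam overlap that real installations add for stitching adjacent views together) counts how many fixed units a given per-unit field of view requires for a full circle.

```python
import math

# How many fixed solid-state units cover 360 degrees, ignoring the
# overlap a real installation would add between adjacent units.
def units_for_full_coverage(unit_fov_deg):
    return math.ceil(360.0 / unit_fov_deg)

print(units_for_full_coverage(120))  # 3 units at 120 degrees each
print(units_for_full_coverage(90))   # 4 units at 90 degrees each
```

Even at the narrow end of the quoted range, four sub-$1,000 units still come in far below the cost of a single spinning system.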

The vast majority of automobile manufacturers agree that LIDAR is key to fully autonomous, self-driving vehicles. LIDAR is a key exteroceptive sensor in these vehicles, and solid-state LIDAR promises to make self-driving cars affordable to the masses.