In June of 2007, I deployed to Iraq as an infantry soldier for a period of fourteen months. During that time, my unit and I came across multiple improvised explosive devices (IEDs), many of which were successfully cleared by this little guy:
This is the Talon explosive ordnance disposal (EOD) robot, manufactured by QinetiQ. Every EOD unit in Iraq was using some kind of unmanned ground vehicle like this to safely handle bombs that were intended to take American lives. We affectionately called these UGVs "Johnny 5" (after the robot in the movie Short Circuit 2), and they saved our lives more than once. It is no surprise, then, that I fully endorse the use of UGVs in combat.
At any rate, the ethical, moral, and legal issues facing UGVs and unmanned maritime vehicles (UMVs) are the same as those facing unmanned aerial vehicles (UAVs): is it right - and, more importantly, is it legal - for a man to use a machine to kill another man a thousand miles away? I believe the answer to the legal question is "yes", given that drone strikes are still used to take out targets in war zones all the time. The ethical question, however, is another matter. Some argue that using robots in war "cheapens the cost of war, making future war more likely" (Wagner, 2017). It is hard to dispute that logic; one of the biggest deterrents to going to war is the potential loss of human life. However, if your side is in no danger of losing anyone (since robots are doing all the fighting), then the only deterrent becomes negative economic impact. And the fact is, war tends to be good for business in a lot of places. It can be argued that war may also be politically harmful, but if you have the strongest military, you make your own politics.
Still, as long as battlefields exist, I believe that UGVs and UMVs should be on them. If a robot can save a soldier's life, then that is all the proof I need that the vehicles belong there. Watching Johnny 5 safely detonate another IED (and sacrifice itself in the process), I didn't care much about how fair war is, or whether the loss of human life does or does not serve as a deterrent. I just wanted to get home in one piece. UGVs made that possible in 2007, and they're making it possible now. Unmanned systems have every right to be on the battlefield alongside their human counterparts.
Reference
Wagner, A. R. (2017). Ask an ethicist: is it ethical to use robots to kill in a war? Retrieved from
https://news.psu.edu/story/452771/2017/02/24/ask-ethicist-it-ethical-use-robots-kill-war
Robots, Drones, and Unmanned Systems
Saturday, October 27, 2018
Monday, October 22, 2018
Aurora Flight Sciences and Socionext Designs Radar Flight Control Module (RFCM)
Small unmanned aerial systems (sUAS) require collision avoidance systems that are not only effective at sensing and avoiding obstacles and other aircraft, but also light enough that they do not affect the vehicle's stability or its flight time. The Radar Flight Control Module (RFCM), currently in development through a joint effort between Aurora Flight Sciences and Socionext Designs, may be the perfect device for the job.
The RFCM's biggest asset is its size. The unit packs a twenty-four gigahertz radar and range measurement software into a single chip barely bigger than a nickel (Smith, 2018). That's not an exaggeration, as the picture below demonstrates:
According to Smith (2018), the module can detect "multiple objects, objects in open spaces, target distance and speed, and more…", making it more than suitable for typical sUAS flight. Smith (2018) goes on to state that the RFCM has a very simple interface, allowing it to be integrated with a wide variety of drone types, not just UAVs. For our purposes, it is enough to note that the nickel-sized radar can be fitted to just about any commercially available UAV, substantially increasing that UAV's safety of operation.
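Since Smith mentions that the module measures target distance and speed, it's worth sketching how a radar at this frequency turns a measurement into a speed. This is generic Doppler radar math, not anything Socionext has published about the RFCM; the numbers are purely illustrative.

```python
# Generic Doppler math for a 24 GHz radar; illustrative only, not RFCM specs.
C = 3.0e8    # speed of light, m/s
F0 = 24.0e9  # carrier frequency, Hz (24 GHz)

def radial_speed(doppler_shift_hz: float) -> float:
    """Radial speed of a target from its Doppler shift: v = f_d * c / (2 * f0)."""
    return doppler_shift_hz * C / (2 * F0)

# At 24 GHz, a 160 Hz shift corresponds to a closing speed of 1 m/s.
print(radial_speed(160.0))
```

The takeaway is that even slow-moving obstacles produce easily measurable frequency shifts at 24 GHz, which helps explain why this band suits small, slow platforms.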
I haven't been able to find any specific numbers on the weight or power requirements of the RFCM; since it is still in development, it is not surprising that the developers want to keep that information under wraps. However, its small footprint has already been noted, and it is reasonable to assume that the unit does not drain a significant amount of power in order to operate. This is a critical consideration, given that forty minutes of flight time is considered long by commercial UAV standards (Rees, 2018).
The two companies are drawing from a tremendous amount of native industrial knowledge in order to develop the system. Aurora Flight Sciences is a subsidiary of Boeing and specializes in the development of autonomous aircraft, while Socionext Designs specializes in "system-on-chip products" (Smith, 2018). "System on chip" is exactly what it sounds like, and exactly what the RFCM is: a complete, functioning system, miniaturized and contained on a small chip. Given the pedigree of the developing companies, it is safe to assume that the radar will work as advertised once finally revealed.
Collision avoidance is becoming increasingly critical as more commercial drones take to the skies. Commercial drones, especially those of the small, recreational variety, fly fast and high and can easily cause accidents if measures are not taken to safeguard both them and the environment around them. The Aurora Flight Sciences/Socionext Radar Flight Control Module may be the best solution to this problem. A radar on a chip, it is small enough to fit the smallest of UAVs but powerful enough to ensure safe flight. Only time will tell if the RFCM will perform as advertised, but if it does, it may make the skies a whole lot safer.
Smith, P. (2018). Aurora Flight Sciences and Socionext develop collision detection system for drones. Retrieved from https://dronebelow.com/2018/01/31/aurora-flight-sciences-socionext-develop-collision-detection-system-drones/
Rees, M. (2018). New commercial quadcopter UAV features 40-minute flight time. Retrieved from uav-features-40-minute-flight-time/
Wednesday, October 17, 2018
Nova Ray Unmanned Underwater Vehicle Control Station Analysis
The Nova Ray unmanned underwater vehicle (UUV) is one of the most distinctive unmanned submersibles on the market. Instead of taking on the torpedo shape favored by a majority of unmanned underwater vehicles, the Nova Ray takes its styling cues (and its name) from the stingray, utilizing a big pair of wings to stabilize itself under water (Rees, 2017). Here is a picture of the Nova Ray, ready to go into the water:
The picture makes it clear that the craft is controlled by a tether, but what kind of control station is it tethered to? According to Coral Partners, the company behind the submersible, each Nova Ray ships with an integrated Control Console (CC) packaged in a Pelican case (Nova Ray, n.d.). Below is an image of the CC:
According to Nova Ray (n.d.), the CC comprises the following components: a laptop computer, a ten-inch LCD monitor, and a four-axis joystick. Doing a little research revealed that a typical four-axis joystick moves up, down, left and right (the first two axes), has a rotating knob (the third axis) and one or more buttons (the fourth axis), giving a tremendous amount of control options from a single stick (Engineering 360, n.d.). This stick is used to control both the movement of the Nova Ray itself and its camera. The laptop and the LCD monitor combine to display both data that the Nova Ray is gathering and the data generated by its internal sensors, as
shown in the picture above: the laptop screen shows the Nova Ray's pitch, attitude, and other variables not readily visible, while the LCD screen shows the images of fish the Nova Ray is recording.
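To make the four-axis description concrete, here is a minimal sketch of how those inputs could translate into commands for the craft. Every name below (the command fields, the function) is my own hypothetical invention; the actual Nova Ray control software is proprietary (Nova Ray, n.d.).

```python
# Hypothetical mapping of the CC's four-axis joystick onto ROV commands.
# Field and function names are invented for illustration; the real
# Nova Ray software is proprietary.
from dataclasses import dataclass

@dataclass
class RovCommand:
    surge: float        # stick forward/back -> forward/reverse thrust
    sway: float         # stick left/right  -> lateral thrust
    yaw: float          # rotating knob     -> turn rate
    camera_mode: bool   # button press      -> stick now aims the camera

def map_joystick(stick_y: float, stick_x: float, knob: float, button: bool) -> RovCommand:
    return RovCommand(surge=stick_y, sway=stick_x, yaw=knob, camera_mode=button)

cmd = map_joystick(0.5, 0.0, -0.2, False)  # half throttle forward, slight left turn
```

The point of the fourth axis (the button) is that a single stick can serve two roles, toggling between driving the vehicle and aiming the camera.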
Unfortunately, details about the software don't get any more specific. Nova Ray (n.d.) states that the command and control software the CC uses is Windows-based and proprietary, but does not go into any more detail. Still, by looking at the screens, we can see that the images on the laptop are quite similar to what we may find in an airplane cockpit, particularly the attitude gauge on the left side of the laptop screen that displays the Nova Ray's position relative to the horizon. I think it is safe to assume that most of the functions within the software are designed to mimic an airplane instrument panel as much as possible, especially given that UAV Propulsion Tech states over and over in their sales material that the Nova Ray "flies" in the water.
Overall, I think the Nova Ray's control console is well-designed, but there is one modification I would make: a twin-stick controller. As a gamer, I'm used to using the left stick of a controller to control movement while using the right stick to control a camera, and this control setup is so widely used in gaming that I think there would be a lot of potential crossover between the controls of a video game and those of the Nova Ray. This is not a novel idea; as far back as 2008, companies like Lockheed Martin were developing video game-style twin-stick controllers for their drones (Hambling, 2008). Potential users are already trained in the use of two sticks, and if this were made an option (like a USB accessory that can be plugged into the included laptop), I think the Nova Ray could find a wider audience.
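Here is a sketch of the twin-stick scheme I'm proposing: the left stick drives the vehicle while the right stick drives the camera, exactly as on a console controller. This is entirely my own hypothetical illustration, not the Nova Ray's actual interface.

```python
# Hypothetical twin-stick control scheme: left stick moves the vehicle,
# right stick aims the camera, as on a standard game controller.
def twin_stick(left_x: float, left_y: float, right_x: float, right_y: float):
    vehicle = {"sway": left_x, "surge": left_y}  # left stick: translation
    camera = {"pan": right_x, "tilt": right_y}   # right stick: camera aim
    return vehicle, camera

# Full speed ahead while panning the camera slightly left:
vehicle, camera = twin_stick(0.0, 1.0, -0.3, 0.0)
```

The appeal over the four-axis joystick is that driving and aiming become simultaneous rather than toggled, and the mapping is one millions of gamers already know.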
Still, the Nova Ray is an interesting craft, and its command and control console seems well put-together. Throw in the option to use a gaming controller, and I just might think about making a purchase.
References
Engineering 360 (n.d.). 4-axis control industrial joysticks datasheets. Retrieved from
Hambling, D. (2008). Game controllers driving drones, nukes. Retrieved from
https://www.wired.com/2008/07/wargames/
Nova Ray (n.d.). NOVA RAY inspection class remotely operated vehicle (ROV). Retrieved from http://www.novaray.com/novaray_outline.htm
Rees, M. (2017). UAV Propulsion Tech to distribute Nova Ray ROV. Retrieved from ray-rov/
Thursday, October 4, 2018
Simtoo: Data and Sensors
The Simtoo Dragonfly bills itself as the "world's first foldable drone" (Simtoo, n.d.). Its dimensions are tiny, as shown in the picture below, taken directly from Simtoo's website:
A UAV this small creates interesting challenges for
designers, both in terms of sensors and memory storage.
The UAV uses micro SD cards to store both photos and
videos (Custer, 2016). This is a small version of the swappable memory system
used by larger drones, and it is a perfect fit for the Dragonfly, as micro SD
cards have the footprint of a thumbnail but can have storage capacities in
excess of two hundred gigabytes (Estrada, 2017). Eliminating onboard storage
was no doubt one of the keys in keeping the Dragonfly small.
According to Simtoo (n.d.), the Dragonfly has six
sensors: a GPS, a gyroscope, a barometer, a magnetometer, an accelerometer, and
a camera capable of taking 4K photos and video. The spec sheet doesn’t indicate
how much power each device draws, but it does state that the Dragonfly's working voltage is 3.7 volts (Simtoo, n.d.), the nominal voltage of a single lithium-polymer cell. Voltage alone doesn't tell us consumption, but the spec sheet also rates the motor at 100 watts, making it by far the biggest consumer of battery power on the Dragonfly; the sensors working together draw comparatively little.
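To see why power draw matters so much at this scale, here is a back-of-envelope flight-time estimate. Only the 3.7-volt nominal voltage and the 100-watt motor rating come from the spec sheet; the battery capacity and average draw below are assumptions for illustration.

```python
# Back-of-envelope energy budget for a small drone. The capacity and
# average draw are illustrative assumptions, not Simtoo's published specs.
def flight_time_minutes(capacity_mah: float, voltage: float, avg_power_w: float) -> float:
    energy_wh = (capacity_mah / 1000.0) * voltage  # energy stored, watt-hours
    return energy_wh / avg_power_w * 60.0          # minutes until empty

# A hypothetical 2700 mAh pack at 3.7 V, motor averaging half its 100 W rating:
print(flight_time_minutes(2700, 3.7, 50.0))  # roughly 12 minutes
```

Even a small extra load shows up directly as lost minutes in the air, which is why keeping the sensor suite's draw negligible next to the motor matters.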
The sensor with the biggest impact on the Dragonfly’s
data treatment strategy is obviously the camera; as mentioned earlier, the
camera is capable of 4K images, which can quickly consume the available empty
space on the inserted micro SD card. It becomes clear why Simtoo chose to make micro SD cards the sole source of onboard memory, as users can swap cards quickly when
ultra-high definition images consume too much of the installed card’s storage
space.
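A quick calculation shows how far a high-capacity card goes against 4K imagery. The per-file sizes here are my assumptions (4K photos commonly run several megabytes, and 4K video on the order of a few hundred megabytes per minute), not Simtoo's figures.

```python
# Rough storage arithmetic for 4K imagery on a micro SD card.
# Per-file sizes are assumptions, not from the Dragonfly spec sheet.
def photos_per_card(card_gb: float, photo_mb: float) -> int:
    return int(card_gb * 1000 / photo_mb)

def video_minutes_per_card(card_gb: float, mb_per_minute: float) -> float:
    return card_gb * 1000 / mb_per_minute

print(photos_per_card(200, 8.0))           # 25000 photos on a 200 GB card
print(video_minutes_per_card(200, 375.0))  # roughly 533 minutes of 4K video
```

The numbers look generous, but a long shooting day of 4K video can still run a card out, which is what motivates the cloud-storage suggestion below.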
The Dragonfly seems to work well, but it is not
perfect. The swappable SD cards have the potential to give the operator virtual
acres of space, but nothing is in place to compensate for the drone running out
of storage in the middle of a crucial shot. For this reason, I am proposing that
Simtoo add cloud-based storage to its system. Cloud storage could give the Dragonfly nearly limitless space, freeing the user from having to watch how much memory is left and how much shooting they can do before cards need to be swapped.
Simtoo can take this concept further, by adding basic photo
editing options to their storage cloud. This way, a pilot can take a photo with
the drone, access it in the cloud, edit it, and then submit it for publishing to
Facebook, Twitter, or any number of other social media sites.
The Simtoo Dragonfly is an intriguing UAV. It is a
drone whose tiny size is its biggest selling point, a point around which the
rest of the system’s design revolves; this design is the reason the designers
looked for the smallest possible device to provide onboard storage. It is also the
reason the UAV’s systems draw as little power as possible. I think the
Dragonfly has potential, but that it needs cloud storage to fully reach that
potential.
References
Simtoo (n.d.). Dragonfly. Retrieved from http://www.simtoo.com/page.php?id=132#page1 on 4 October, 2018.
Custer, C. (2016). Dragonfly drone wants to be a slightly cheaper alternative to DJI (review).
Estrada, M. (2017). The 5 highest capacity micro SD cards you can get right now on Amazon.
microsdxc/
Thursday, September 27, 2018
Sensor Placement in Photography and Racing Drones
Sensor placement is one of the most crucial elements of unmanned system design, and the purpose of the unmanned system dictates exactly where and how critical sensors are placed. Quadcopter drones (the four-rotor helicopter drones that have become ubiquitous in electronics stores) all look similar from a distance, but vary wildly in their sensor placement depending on their purpose. Two such unmanned systems are photography drones and FPV racing drones, which superficially look similar but are actually worlds apart in both purpose and sensor placement.
DJI is one of the industry leaders in hobby and commercial photography drones, and their Phantom 4 drone stands at the top of their professional-level UAV catalog (Mellors, n.d.). According to Mellors (n.d.), the chief feature of the Phantom 4 is the 4K camera, which enables the user to record ultra-high resolution video. Below is an image of the drone:
This is a great example of the drone’s camera placement. It is mounted on a
gimbal on the bottom of the craft, which gives it the ability to both rotate and
pan up and down. This allows the camera to adjust to a wide variety of
positions while the drone is hovering in place, giving the pilot the ability
to capture a wide variety of images and angles without having to reposition
the drone. The gimbal also compensates for the UAV’s movement in the air,
enabling the camera to record stable footage even if the craft is moving
(Mellors, n.d.). Sensor placement is key to the Phantom 4's functionality as a photography tool.
Camera drones aren't the only quadcopters in the skies; first-person view (FPV) racing drones are also becoming popular. FPV drone racing is a growing sport where competitors fly their drones through pre-built tracks (Drone Enthusiast, 2016). The "FPV" is the key term here; operators utilize displays which show a first-person view from the drone, almost as if the pilot is sitting in the drone's cockpit. The FPV requirement dictates very specific mounting for the racing UAV's camera, namely its nose:
This is the Walkera Rodeo 150 FPV racing drone. Just like the DJI Phantom 4, it is a quadcopter, but that is where the similarities end. The Rodeo is a lightweight craft; it weighs only a hundred and fifty grams, and is designed for indoor and outdoor racing (Brown, n.d.). According to Brown (n.d.), it is also designed for aerobatic flight.
Since this is an FPV racing drone, the camera is mounted to the UAV's nose, centered in such a way as to simulate a cockpit view. According to Brown (n.d.), the camera's position is fixed in place, with the optics allowing for a wide-angle view of one hundred and ten degrees. As Brown (n.d.) points out, this is a suboptimal position for photography, which removes the drone from consideration for serious photographic work. However, in its intended use as an FPV viewfinder, the drone works very well, with the UAV's electronics transmitting the camera's
data to a viewing device with almost no lag.
Sensor placement is a critical element of UAV design, and the drone's purpose often decides where key sensors are placed. The most important sensor on both a racing drone and a photography UAV is the camera, although camera placement varies wildly from one platform to another. The DJI Phantom 4 carries its camera on its belly in order to allow for maximum photographic flexibility; the Walkera Rodeo 150 carries its camera on its nose, in order to give its pilot the best possible view when careening through a racetrack at high speeds. Both drones work well in their intended roles, and sensor placement is key to their successful function.
References
Brown, J. (n.d.). Walkera Rodeo 150: a compact FPV quadcopter for racing. Retrieved from http://mydronelab.com/reviews/walkera-rodeo-150.html on 27 September 2018.
Drone Enthusiast (2016). FPV drone racing – the UAV sport about to hit the big time. Retrieved from https://www.dronethusiast.com/fpv-drone-racing/ on 27 September 2018.
Mellors, J. (n.d.). DJI Phantom 4 review for photographers. Retrieved from
https://shotkit.com/dji-phantom-4-review-for-photographers/
on 27 September 2018.
Wednesday, September 19, 2018
US Navy Bluefin-12(D)
In November of 2017, an Argentinian Navy submarine named the ARA San Juan
disappeared while traveling from an Argentinian naval base in Tierra del Fuego to its home port
in Mar del Plata (McKirdy, 2017). According to McKirdy (2017), the submarine, if underwater,
was in danger of running out of oxygen in as little as seven days. The vessel’s disappearance
sparked a frantic search from both the Argentine Navy and its international partners. One of
those partners, the US Navy, deployed four unmanned underwater vehicles (UUVs) in support of
the search (Werner, 2017). According to Werner (2017), one of those vehicles was the Bluefin-
12(D).
The Bluefin-12(D), manufactured by General Dynamics, is a torpedo-shaped, highly
modular UUV capable of diving to a maximum depth of fifteen hundred meters (General
Dynamics, n.d.). According to General Dynamics (n.d.), its primary proprioceptive sensor is an
inertial measurement unit (IMU), which internally monitors the UUV’s speed and pitch in the
water. General Dynamics (n.d.) also states that one of the craft's chief exteroceptive sensors is a Doppler velocity log (DVL), which measures the Doppler shift created by the UUV as it travels through water and then uses that information to confirm the IMU's data. Rounding out
the suite of navigation sensors are a GPS and a compass, all of which aid the Bluefin in
navigating deep water.
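The idea of the DVL "confirming" the IMU's data can be sketched as a simple complementary filter: blend the drift-prone IMU speed estimate with the noisier but drift-free DVL reading. This is a generic navigation technique for illustration; the blend weight is my assumption, and General Dynamics has not published how the Bluefin actually fuses the two.

```python
# Toy complementary filter illustrating DVL-aided IMU speed estimation.
# The 0.2 DVL weight is an illustrative assumption, not a Bluefin spec.
def fuse_speed(imu_speed: float, dvl_speed: float, dvl_weight: float = 0.2) -> float:
    """Blend the two estimates, letting the DVL pull IMU drift back in line."""
    return (1.0 - dvl_weight) * imu_speed + dvl_weight * dvl_speed

# IMU has drifted up to 2.3 m/s while the DVL still reads 2.0 m/s:
print(fuse_speed(2.3, 2.0))  # estimate pulled back toward the DVL reading
```

Applied every update cycle, a correction like this keeps the integrated IMU solution from wandering during long dives, which is exactly the role the DVL plays in the Bluefin's sensor suite.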
The US Navy made an official statement that the Bluefin-12(D) was being utilized in the
search, but did not specify what search and rescue-specific modifications were made to the craft
(US Navy, 2017). However, a quick look at the Navy’s official photo of the Bluefin shows that it
is missing one modification that I think would greatly improve the craft’s ability to carry out its
mission: a manipulator arm.
Bluefin-12(D). Source: US Navy Public Affairs Office.
I believe that a manipulator arm would greatly increase the Bluefin’s effectiveness, as it
would give the craft the ability to move debris and remove obstacles, thereby clearing a path for
manned rescue efforts. While not specified in the craft’s spec sheet, the Bluefin’s modularity
indicates that it can be mounted with a manipulator arm without too much difficulty.
Unmanned aerial vehicles (UAVs) can be combined with unmanned surface and
underwater vehicles to maximize search effectiveness in maritime environments. In 2017, a
research team from the Air Force Engineering University in China conducted a study that
determined that UAVs can be far more effective than manned aircraft in maritime search-and-
rescue operations (Lei, Jianbo, & Shukui, 2017). According to Lei et al. (2017), the ability of
UAVs to maintain sustained operations while providing both automatic and manual scanning
greatly increases the odds of rescuers finding lost personnel. By utilizing both a UAV and a
UUV, rescuers can systematically search large areas of water for longer periods of time (and at a
much lower cost) than by utilizing manned craft for the same purpose.
UUVs have many advantages over their manned counterparts when it comes to maritime
operations. They are generally far less expensive to operate than manned craft and present no
danger to pilots or crew while operating (McPhail, 2002). They are also small and compact in a
way that manned craft cannot be. The Bluefin-12(D) is small enough to be maneuverable
in tight spaces, a potentially crucial ability when attempting to navigate wreckage. A craft the
size of the Bluefin is far too small to accommodate a human crew.
The story of the Argentine submarine does not have a happy ending; the US Navy called off its search for the vehicle six weeks after it disappeared (Chaplain, 2017). As of May of 2018,
the submarine is still missing, with little hope of survival for its forty-four person crew
(Goldman, 2018). However, while this operation may have failed thus far, the increasing use of
UUVs in maritime search and rescue operations may increase the number of lives saved in the
future.
References
Chaplain, C. (2017). US Navy ends search for missing Argentine submarine ARA San Juan. Retrieved from https://www.standard.co.uk/news/world/us-navy-ends-search-for-missing-argentine-submarine-ara-san-juan-a3727766.html
General Dynamics (n.d.). Bluefin-12D autonomous underwater vehicle (AUV). Retrieved from https://gdmissionsystems.com/en/products/underwater-vehicles/bluefin-12-d-autonomous-underwater-vehicle
Goldman, J. (2018). 6 months after Argentine submarine went missing, families feel ‘invisible’. Retrieved from https://abcnews.go.com/International/months-argentine-submarine-missing-families-feel-invisible/story?id=55146472
Lei, Z., Jianbo, H., & Shukui, X. (2017). Marine search and rescue of UAV in long-distance
security modeling simulation. Polish Maritime Research, 95(24), 192-199.
McKirdy, E. (2017). Argentina’s missing submarine: what we know. Retrieved from
https://edition.cnn.com/2017/11/20/americas/argentina-submarine-what-we-know/index.html
on 18 November, 2018.
McPhail, S. (2002). Autonomous underwater vehicles: are they the ideal sensor platforms for
ocean margin science? Ocean Margin Systems, 79-97.
US Navy. (2017). US Navy deploys unmanned submersibles in Argentine submarine search.
Retrieved from https://www.navy.mil/submit/display.asp?story_id=103420
Werner, B. (2017). US Navy undersea teams now underway as part of Argentine submarine search. Retrieved from https://news.usni.org/2017/11/22/u-s-navy-unmanned-underseateams-now-underway-part-argentine-submarine-search
In November of 2017, an Argentinian Navy submarine named the ARA San Juan
disappeared while traveling from an Argentinian naval base in Tierra del Fuego to its home port
in Mar del Plata (McKirdy, 2017). According to McKirdy (2017), the submarine, if underwater,
was in danger of running out of oxygen in as little as seven days. The vessel’s disappearance
sparked a frantic search from both the Argentine Navy and its international partners. One of
those partners, the US Navy, deployed four unmanned underwater vehicles (UUVs) in support of
the search (Werner, 2017). According to Werner (2017), one of those vehicles was the Bluefin-
12(D).
The Bluefin-12(D), manufactured by General Dynamics, is a torpedo-shaped, highly
modular UUV capable of diving to a maximum depth of fifteen hundred meters (General
Dynamics, n.d.). According to General Dynamics (n.d.), its primary proprioceptive sensor is an
inertial measurement unit (IMU), which internally tracks the UUV’s speed and pitch in the
water. General Dynamics (n.d.) also states that one of the craft’s chief exteroceptive sensors is a
Doppler velocity log (DVL), which measures the Doppler shift of acoustic pulses reflected back
to the UUV as it travels through the water, estimates the craft’s velocity over ground from that
shift, and uses the estimate to confirm and correct the IMU’s data. Rounding out the suite of
navigation sensors are a GPS receiver and a compass, both of which aid the Bluefin in
navigating deep water.
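The navigation scheme described above, in which the DVL’s direct velocity measurement checks and corrects the IMU’s drift-prone estimate, can be sketched in a few lines. This is a simplified, hypothetical illustration; the function names and blend factor are my own, not General Dynamics specifications:

```python
# Hypothetical sketch of IMU/DVL fusion for UUV dead reckoning.
# The blend factor alpha is an illustrative value, not a Bluefin spec.

def fuse_velocity(imu_velocity, dvl_velocity, alpha=0.02):
    """Blend the IMU's integrated velocity estimate with the DVL's
    direct velocity-over-ground measurement (a simple complementary
    filter: mostly trust the IMU, nudge it toward the DVL)."""
    return (1 - alpha) * imu_velocity + alpha * dvl_velocity

def dead_reckon(position, imu_velocity, dvl_velocity, dt):
    """Advance the estimated position by one timestep using the
    DVL-corrected velocity."""
    v = fuse_velocity(imu_velocity, dvl_velocity)
    return position + v * dt, v

# Example: a drifting IMU reads 1.05 m/s while the DVL measures 1.00 m/s;
# the fused estimate is pulled slightly toward the DVL's reading.
pos, v = dead_reckon(0.0, 1.05, 1.00, dt=1.0)
```

In a real AUV the fusion is done with a Kalman filter over full 3D state, but the principle is the same: the exteroceptive DVL keeps the proprioceptive IMU honest.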
The US Navy made an official statement that the Bluefin-12(D) was being utilized in the
search, but did not specify what search and rescue-specific modifications were made to the craft
(US Navy, 2017). However, a quick look at the Navy’s official photo of the Bluefin shows that it
is missing one modification that I think would greatly improve the craft’s ability to carry out its
mission: a manipulator arm.
Bluefin-12(D). Source: US Navy Public Affairs Office.
I believe that a manipulator arm would greatly increase the Bluefin’s effectiveness, as it
would give the craft the ability to move debris and remove obstacles, thereby clearing a path for
manned rescue efforts. While not specified in the craft’s spec sheet, the Bluefin’s modularity
suggests that a manipulator arm could be mounted without too much difficulty.
Unmanned aerial vehicles (UAVs) can be combined with unmanned surface and
underwater vehicles to maximize search effectiveness in maritime environments. In 2017, a
research team from the Air Force Engineering University in China conducted a study that
determined that UAVs can be far more effective than manned aircraft in maritime search-and-
rescue operations (Lei, Jianbo, & Shukui, 2017). According to Lei et al. (2017), the ability of
UAVs to maintain sustained operations while providing both automatic and manual scanning
greatly increases the odds of rescuers finding lost personnel. By utilizing both a UAV and a
UUV, rescuers can systematically search large areas of water for longer periods of time (and at a
much lower cost) than by utilizing manned craft for the same purpose.
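As a rough illustration of what systematically searching a large area of water looks like in practice, here is a sketch of the classic lawnmower (boustrophedon) pattern an unmanned craft might fly or swim over a rectangular search box. The dimensions and swath width below are made-up values for illustration, not figures from the cited study:

```python
# Illustrative lawnmower search-pattern generator: sweep a rectangle in
# parallel tracks spaced one sensor swath apart. All numbers are made up.

def lawnmower_waypoints(width, height, swath):
    """Return (x, y) waypoints covering a width x height rectangle with
    parallel north-south tracks, alternating direction each track."""
    waypoints = []
    x = 0.0
    going_up = True
    while x <= width:
        if going_up:
            waypoints += [(x, 0.0), (x, height)]   # sweep upward
        else:
            waypoints += [(x, height), (x, 0.0)]   # sweep back down
        going_up = not going_up
        x += swath
    return waypoints

# A 1000 m x 400 m box swept with a 200 m sonar swath yields six tracks.
track = lawnmower_waypoints(1000, 400, 200)
```

A UAV can fly this pattern over the surface while a UUV runs the same geometry at depth, which is one way the two platforms complement each other.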
UUVs have many advantages over their manned counterparts when it comes to maritime
operations. They are generally far less expensive to operate than manned craft and present no
danger to pilots or crew while operating (McPhail, 2002). They are also small and compact in a
way that manned craft cannot be. The Bluefin-12(D) is small enough to be maneuverable
in tight spaces, a potentially crucial ability when attempting to navigate wreckage. A craft the
size of the Bluefin is far too small to accommodate a human crew.
The story of the Argentine submarine does not have a happy ending; the US Navy called
off its search for the vessel six weeks after it disappeared (Chaplain, 2017). As of May 2018,
the submarine was still missing, with little hope of survival for its forty-four-person crew
(Goldman, 2018). However, while this operation may have failed thus far, the increasing use of
UUVs in maritime search and rescue operations may increase the number of lives saved in the
future.
References
Chaplain, C. (2017). US Navy ends search for missing Argentine submarine ARA San Juan.
Retrieved from https://www.standard.co.uk/news/world/us-navy-ends-search-for-missing-argentine-submarine-ara-san-juan-a3727766.html
General Dynamics (n.d.). Bluefin-12D autonomous underwater vehicle (AUV). Retrieved from
https://gdmissionsystems.com/en/products/underwater-vehicles/bluefin-12-d-autonomous-
underwater-vehicle
Goldman, J. (2018). 6 months after Argentine submarine went missing, families feel ‘invisible’.
Retrieved from https://abcnews.go.com/International/months-argentine-submarine-missing-families-feel-invisible/story?id=55146472
Lei, Z., Jianbo, H., & Shukui, X. (2017). Marine search and rescue of UAV in long-distance
security modeling simulation. Polish Maritime Research, 95(24), 192-199.
McKirdy, E. (2017). Argentina’s missing submarine: what we know. Retrieved 18 November 2018,
from https://edition.cnn.com/2017/11/20/americas/argentina-submarine-what-we-know/index.html
McPhail, S. (2002). Autonomous underwater vehicles: are they the ideal sensor platforms for
ocean margin science? Ocean Margin Systems, 79-97.
US Navy. (2017). US Navy deploys unmanned submersibles in Argentine submarine search.
Retrieved from https://www.navy.mil/submit/display.asp?story_id=103420
Werner, B. (2017). US Navy undersea teams now underway as part of Argentine submarine
search. Retrieved from https://news.usni.org/2017/11/22/u-s-navy-unmanned-undersea-teams-now-underway-part-argentine-submarine-search
Monday, September 10, 2018
Original article: A self-driving car in every driveway? Solid-state LIDAR is the key.
Author: Nick Mokey
URL: https://www.digitaltrends.com/cars/solid-state-lidar-for-self-driving-cars/
Summary:
LIDAR is an exteroceptive sensor that is crucial in the operation of self-driving vehicles. It works on the same principle as radar, but utilizes light instead of radio waves. Here is an image of a conventional LIDAR system on an autonomous vehicle:
The module you see on top of these cars is the LIDAR system. It spins rapidly while emitting beams of light. As this light bounces back towards the system, the LIDAR calculates the positions and angles of the reflected light to create a three-dimensional image of the car's surroundings. The car then uses this data to plot a path through its environment. LIDAR is a key exteroceptive sensor on many self-driving cars.
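The ranging principle behind this is simple enough to show with a toy calculation: the distance to a target is the speed of light multiplied by the pulse’s round-trip time, divided by two, and each return at a known beam angle becomes one point in the car’s map. A minimal sketch (my own illustration, not from the article):

```python
# Toy illustration of LIDAR ranging: time-of-flight to distance, then
# (range, bearing) to a Cartesian point in the car's frame.

import math

C = 299_792_458  # speed of light in m/s

def range_from_tof(round_trip_seconds):
    """Distance to a target: light travels out and back, so divide by 2."""
    return C * round_trip_seconds / 2

def to_cartesian(distance, angle_degrees):
    """Convert one (range, bearing) return into an (x, y) point."""
    a = math.radians(angle_degrees)
    return distance * math.cos(a), distance * math.sin(a)

# A pulse returning after about 200 nanoseconds indicates a target
# roughly 30 meters away.
d = range_from_tof(200e-9)
```

Repeating this for millions of returns per second, across all the beam angles the spinning head sweeps through, is what builds the three-dimensional image.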
It is also unreliable and expensive. A conventional LIDAR system requires many moving parts, all built around a console that must spin very rapidly in order to work properly. Any system with that many moving parts spinning at high speed is prone to multiple modes of failure. And at upwards of $75,000, a single system costs more than many of the cars to which it is mated. LIDAR is critical technology, but it is unrealistic as a mass-market solution in its current form.
Enter solid-state LIDAR. This is what a solid-state module looks like:
This solid-state system, developed by Velodyne, is one of many modules currently in development. It is a fraction of the size and a fraction of the cost of a conventional spinning LIDAR.
So how does it work? Simply put, "solid state" means no moving parts, and a solid-state LIDAR works by utilizing an array of light emitters to scan a focused slice of the surrounding environment. According to Quanergy, a manufacturer quoted in the article, this drives the price point of a single solid-state LIDAR system to under $1,000, with technology improvements lowering the price further still. A solid-state LIDAR system can only see a limited slice of its environment (something like 90-120 degrees, depending on the unit), necessitating the use of multiple systems for full 360-degree coverage. However, given the unit's size and price point, it is still far more practical and cost effective to install three or more solid-state units on a car than to use just one conventional LIDAR system.
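The arithmetic behind that claim is easy to check. Taking the roughly 120-degree field of view and the sub-$1,000 price point mentioned above (the exact numbers vary by unit, so treat these as illustrative):

```python
# Rough cost-comparison sketch. The ~$1,000 solid-state price and the
# 90-120 degree FOV range come from the article; the math is just
# "how many fixed units does it take to see all the way around?"

import math

def units_for_full_coverage(fov_degrees):
    """Minimum number of fixed solid-state units for 360-degree coverage."""
    return math.ceil(360 / fov_degrees)

def solid_state_cost(fov_degrees, unit_price):
    """Total cost of a full-coverage solid-state array."""
    return units_for_full_coverage(fov_degrees) * unit_price

n = units_for_full_coverage(120)     # three 120-degree units cover 360
total = solid_state_cost(120, 1000)  # still a small fraction of $75,000
```

Even at the narrow end of the field-of-view range (90 degrees, requiring four units), the full array costs a few thousand dollars against $75,000 for one spinning unit.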
The vast majority of automobile manufacturers agree that LIDAR is key to fully autonomous, self-driving vehicles. LIDAR is a core exteroceptive sensor in these vehicles, and solid-state LIDAR promises to make self-driving cars affordable to the masses.