How to create the 3D effect in Photoshop: Open an RGB image, then go to the Channels palette and click on the Red channel. Go under the Filter menu, under Other, and choose Offset. For Horizontal enter –5 and set Vertical to zero. For Undefined Areas, choose Repeat Edge Pixels, then click OK. In the Channels palette, click on the RGB channel to reveal the effect. Lastly, determine which part you want to appear to “come out of the image” toward the viewer. Switch to the History Brush (Y) and, using a soft-edged brush, paint over the area you want to “jump out” of the image. As you paint with the History Brush, you’ll see your original untouched image paint back in (don’t sweat it, that’s what it’s supposed to do). Now all you have to do is order the glasses.
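For those who'd rather script it, the same red-channel offset can be approximated outside Photoshop. Here's an illustrative sketch using the Pillow library (the function name and the edge-repeat trick are my own, not part of the tutorial):

```python
from PIL import Image

def red_offset_3d(img, shift=5):
    """Shift the red channel `shift` px left (Photoshop's Offset, Horizontal -5),
    repeating edge pixels on the right, then remerge the channels."""
    img = img.convert("RGB")
    r, g, b = img.split()
    w, h = img.size
    shifted = Image.new("L", (w, h))
    # Move the red channel left...
    shifted.paste(r.crop((shift, 0, w, h)), (0, 0))
    # ...and fill the exposed strip by stretching the last column
    # (Photoshop's "Repeat Edge Pixels" behavior).
    edge = r.crop((w - 1, 0, w, h)).resize((shift, h))
    shifted.paste(edge, (w - shift, 0))
    return Image.merge("RGB", (shifted, g, b))
```

The History Brush step would amount to blending the original image back over the result with a soft mask on the area you want to pop out.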
DefenseReview (DR) recently received a press release on a light-based see-through-wall/see-around-corners x-ray-vision-type technology being developed by researchers at the SMU Lyle School of Engineering with funding help from DARPA (Defense Advanced Research Projects Agency), and it’s a really fascinating read. SMU calls the project “OMNISCIENT” (Obtaining Multipath & Non-line-of-sight Information by Sensing Coherence & Intensity with Emerging Novel Techniques–a real mouthful). DARPA just calls it “REVEAL”.
Being light-based, the OMNISCIENT/REVEAL tech is completely different from earlier millimeter-wave radar-based see-through-wall imaging technologies like Eureka Aerospace’s Impulse Synthetic Aperture Radar (ImpSAR) and Time Domain’s RV2000 SoldierVision ultra-wideband “PulsON” radar tech, both of which DR’s covered.
The OMNISCIENT/REVEAL tech utilizes a computer algorithm to “unscramble the light that bounces off irregular surfaces to create a holographic image of hidden objects”, according to the press release. “This will allow us to build a 3-D representation – a hologram – of something that is out of view,” said Marc Christensen, dean of the Bobby B. Lyle School of Engineering at SMU and “principal investigator for the project”. The following quotes from Christensen are the most interesting portions of the press release:
“Light bounces off the smooth surface of a mirror at the same angle at which it hits the mirror, which is what allows the human eye to ‘see’ a recognizable image of the event – a reflection.”
DefenseReview (DR) got to take a look at Revision Multi-Use Bump Shell/modular ballistic combat helmet prototypes with skeletonized accessory rail mount systems, which makes one wonder why it took so long for a company to offer just such a product. The Unit (CAG/Delta Force) used what were essentially skateboard helmets as bump helmets during Operation Gothic Serpent and the subsequent Battle of Mogadishu (1993) in Somalia, due to their ultra-light weight and ability to protect against a reasonable amount of kinetic impact. The Revision Multi-Use Bump Shell just takes this concept and adds the ability to up-armor it with modular ballistic armor panels, thus turning it into a ballistic combat helmet.
Vineyards, large farms, and NASA all use near-infrared photography for assessing plant health, usually by mounting expensive sensors on airplanes and satellites.
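What those airborne sensors capture is typically reduced to a vegetation index; the classic one is NDVI, computed per pixel from near-infrared and red reflectance. A minimal sketch (the function name is mine, and real pipelines operate on whole raster bands rather than single pixels):

```python
# NDVI (Normalized Difference Vegetation Index): the standard way
# near-infrared imagery is turned into a plant-health score.
# NDVI = (NIR - Red) / (NIR + Red), ranging -1..1; dense healthy
# vegetation reflects strongly in NIR and so scores high.
def ndvi(nir, red):
    """Per-pixel NDVI from NIR and red reflectance values (0-1)."""
    denom = nir + red
    return 0.0 if denom == 0 else (nir - red) / denom
```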
Cooler objects glow faintly at longer wavelengths of light, while hotter objects glow more brightly at shorter wavelengths. Our Sun’s temperature is a blistering 5,778 K (9,940° F), which is so hot that it glows brightest at visible wavelengths of light (around 0.4 – 0.7 microns). People, who are much cooler (310 K, 98° F), actually glow as well, but in infrared light with a wavelength of around 10 microns. A micron is a millionth of a meter.
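The temperature-to-peak-wavelength relationship quoted above is Wien’s displacement law, peak wavelength = b / T with b ≈ 2,898 micron-kelvins. A quick check of the article’s numbers:

```python
WIEN_B = 2898.0  # Wien's displacement constant, in micron-kelvins

def peak_wavelength_microns(temp_kelvin):
    """Wavelength (in microns) at which a blackbody at T glows brightest."""
    return WIEN_B / temp_kelvin

# The Sun (5,778 K) peaks in the visible band (~0.5 microns);
# a person (310 K) peaks near 10 microns, in the thermal infrared.
```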
Light in the SWIR band is not visible to the human eye. The visible spectrum extends from wavelengths of 0.4 microns (violet, nearly ultraviolet to the eye) to 0.7 microns (deep red). Wavelengths longer than visible wavelengths can only be seen by dedicated sensors, such as InGaAs detectors. But although light in the shortwave infrared region is not visible to the eye, it interacts with objects in a similar manner to visible wavelengths. That is, SWIR light is reflective light; it bounces off objects much like visible light. As a result of its reflective nature, SWIR light produces shadows and contrast in its imagery. Images from an InGaAs camera are comparable to visible images in resolution and detail; however, SWIR images are not in color. This makes objects easily recognizable and yields one of the tactical advantages of SWIR imaging, namely object or individual identification.
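Pulling the wavelength figures from the last two paragraphs together: band boundaries vary by convention, but a rough map (the cutoffs below are common illustrative values, not from the article) looks like this:

```python
# Approximate spectral bands in microns; exact boundaries vary by convention.
BANDS = [
    ("visible", 0.4, 0.7),
    ("NIR",     0.7, 1.0),   # near-infrared
    ("SWIR",    1.0, 2.5),   # shortwave infrared (InGaAs territory)
    ("MWIR",    3.0, 5.0),   # midwave infrared
    ("LWIR",    8.0, 14.0),  # longwave infrared (thermal imagers, warm bodies)
]

def band_of(wavelength_microns):
    """Name the spectral band a wavelength falls in, or 'other'."""
    for name, lo, hi in BANDS:
        if lo <= wavelength_microns <= hi:
            return name
    return "other"
```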
Search and rescue (SAR) operations encompass the search for, and provision of aid to, people or objects in distress or imminent danger. SAR operations involve large ground teams, centralized coordination, and preferably aerial assistance. Traditionally, aerial assistance was provided by camera-equipped helicopters, which gave an overview of the search area and communicated the camera feed to the ground team.
Unmanned aircraft systems provide a cheaper, safer way to incorporate aerial assistance into SAR operations. Thanks to their modest size, these systems can be kept readily available; they provide visible as well as thermal imagery, which can be downlinked in real time for immediate use; and their low noise footprint means they do not interfere with the operation.