
Posts

Showing posts from 2011

Waveform Lidar Simulation

One of the more interesting areas of development right now for laser radar is "waveform" LIDAR systems. Unlike Linear-mode (Lm) or Geiger-mode (Gm) systems, which collect a relatively small number of height measurements per GSD, a "waveform" LIDAR (wLIDAR) essentially digitizes the returned flux during the collection time window at some time resolution. That means you have intensity information at all ranges rather than just a discrete set of them, which allows for some very sophisticated post-processing of the data. The "Airborne Taxonomic Mapping System" (AToMS) operated by the Carnegie Airborne Observatory (CAO) is being used to better quantify the biomass in complex tree canopies and grasslands. The remote sensing arm of the National Ecological Observatory Network (NEON) plans on operating a wLIDAR on its Airborne Observation Platform (AOP). Most wLIDAR systems feature a high-performance, single-pixel detector coupled to a high-performance digitizer.
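
To make the time-to-range bookkeeping concrete, here is a minimal sketch (plain NumPy, not DIRSIG's interface) of how a digitizer's time bins map to range via the two-way travel time, r = ct/2. The sample period and window length below are hypothetical.

    import numpy as np

    C = 2.998e8  # speed of light (m/s)

    def sample_ranges(n_samples, dt, t0=0.0):
        """Range associated with each digitizer time bin (two-way travel)."""
        t = t0 + np.arange(n_samples) * dt
        return C * t / 2.0

    # Example: a 1 ns sample period gives ~0.15 m of range per bin.
    ranges = sample_ranges(1024, dt=1.0e-9)
    print(ranges[1] - ranges[0])  # -> ~0.15 (meters per bin)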

Primitives

Like most ray tracing environments, DIRSIG offers a number of convenient primitive objects to use in place of facetized geometry. Aside from offering a quick way to construct a (simple) scene, primitives have mathematically defined surfaces, so there is no need to worry about edges between facets. In contrast to vertex normal interpolation, which merely smooths the appearance of facetized objects, primitives offer true, smooth geometry, which can be convenient for working out precise radiometry problems. We've already made some of these primitive objects available under the old object database (ODB) inputs, as shown in the PrimitiveObjects1 demo. These objects are shown below: a box, a sphere, a cylinder, a ground plane, a disk, and a two-material disk (introduced to quickly model a Secchi disk for virtually measuring turbidity). We've now updated the "glist" format to support these primitives.
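
A minimal sketch of why analytic primitives behave so well radiometrically: a ray-sphere intersection yields an exact surface normal at the hit point, with no facet edges or interpolated normals involved. This is illustrative NumPy, not DIRSIG code.

    import numpy as np

    def ray_sphere(origin, direction, center, radius):
        """Nearest hit of a ray with an analytic sphere, or None.

        Solves |o + t*d - c|^2 = r^2, a quadratic in t, assuming a
        unit-length direction vector.
        """
        oc = origin - center
        b = 2.0 * np.dot(direction, oc)
        c = np.dot(oc, oc) - radius * radius
        disc = b * b - 4.0 * c
        if disc < 0.0:
            return None                       # ray misses the sphere
        t = (-b - np.sqrt(disc)) / 2.0        # nearer root first
        if t < 0.0:
            t = (-b + np.sqrt(disc)) / 2.0    # origin may be inside
        if t < 0.0:
            return None                       # sphere is behind the ray
        hit = origin + t * direction
        normal = (hit - center) / radius      # exact, no interpolation
        return t, hit, normal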

Data-driven focal planes and modeling Bayer Pattern CFAs

One of the new features that will be in the upcoming DIRSIG 4.4.2 release is something we have been calling "data-driven focal planes". This mechanism had been on the drawing board for many years, and we finally had a reason to implement it on our contract to help model the Landsat Data Continuity Mission (LDCM) sensors, which many of you will come to know as Landsat 8 when it is launched in December 2012. The concept of the "data-driven" focal plane is to provide an alternative to the parameterized detector array geometry, where you specify the focal length, the number of pixels, the pixel pitch, and the pixel spacings. Instead, a "data-driven" focal plane uses a "database" where each record describes both the geometric and radiometric properties of a single pixel. The key feature is that the geometric and optical properties are described at the front of the aperture. This allows the model to ingest optical predictions from complex optical models.
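
To illustrate the idea (the field names below are hypothetical, not DIRSIG's actual schema), each record in such a database might pair a pixel's pointing geometry with its radiometric response:

    from dataclasses import dataclass

    @dataclass
    class PixelRecord:
        """One per-pixel entry in a hypothetical data-driven focal plane.

        Geometry is expressed at the front of the aperture, so distortion
        predicted by an external optical model is captured directly.
        """
        pixel_id: int
        az_angle: float    # pointing azimuth relative to boresight (rad)
        el_angle: float    # pointing elevation relative to boresight (rad)
        ifov: float        # instantaneous field of view (rad)
        rel_gain: float    # relative radiometric gain
        offset: float      # radiometric offset (e.g., dark level)

    # A parameterized array would derive az/el from focal length and pitch;
    # a data-driven one simply reads rows like this, one per pixel:
    p = PixelRecord(0, az_angle=-0.0123, el_angle=0.0005,
                    ifov=4.2e-5, rel_gain=0.98, offset=0.0)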

Work in progress: Whole earth model

One of the fundamental modules in D5 supplies a model of the earth (or other roughly ellipsoidal planet) to the core representation of the DIRSIG universe. The distributed version of D5 will likely contain a simple, material-mapped sphere representation of the earth by default (to keep download sizes reasonable), with the option to obtain more detailed (and more correct) models. To this end, we've been working on an earth model based on the WGS84 reference ellipsoid and SRTM (Shuttle Radar Topography Mission) DEM data. Since this is a very large data set (the 500m data we're currently using is 4.7GB in its raw form), it is an ideal case for testing some of our new large file handling routines and integrating them into geometry interactions. We model the earth as a collection of triangle mesh patches representing cells (primarily hexagons) of a geodesic grid, specifically the ISEA aperture 4, resolution 7 grid (i.e. 163,842 index cells). Each patch can be loaded into memory independently.
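
As a small aside on the numbers involved: an aperture-4 geodesic grid has 10·4^r + 2 cells at resolution r, which gives the 163,842 cells quoted above at r = 7. The sketch below (generic WGS84 math, not D5 code) shows that count plus the standard geodetic-to-Cartesian conversion such an ellipsoid model rests on.

    import math

    # Cell count of an aperture-4 geodesic grid at resolution r
    def isea4_cells(r):
        return 10 * 4**r + 2

    print(isea4_cells(7))  # -> 163842

    # WGS84 geodetic (lat, lon, height) -> earth-centered Cartesian (ECEF)
    A = 6378137.0            # semi-major axis (m)
    F = 1.0 / 298.257223563  # flattening
    E2 = F * (2.0 - F)       # first eccentricity squared

    def geodetic_to_ecef(lat_deg, lon_deg, h):
        lat, lon = math.radians(lat_deg), math.radians(lon_deg)
        n = A / math.sqrt(1.0 - E2 * math.sin(lat)**2)  # prime vertical radius
        x = (n + h) * math.cos(lat) * math.cos(lon)
        y = (n + h) * math.cos(lat) * math.sin(lon)
        z = (n * (1.0 - E2) + h) * math.sin(lat)
        return x, y, z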

Looking ahead to DIRSIG5

The transition from DIRSIG 3 to DIRSIG 4 was a major shift towards a more modern code architecture and a more flexible simulation design. The goal was to address the development challenges faced at that time and to be ready for the new features and capabilities that would come. During the past few years, we have added new modalities (particularly LiDAR), new platform support (Windows), an entirely new GUI interface, native 3D radiative transfer code, extensive sensor and platform modeling, support for more complex scene radiometry, and many other features to support user needs. Every new piece of code has expanded, and in some cases stressed, the underlying DIRSIG 4 framework and its original design. While this evolution has been functionally successful, we have sacrificed execution speed and our agility in adapting to new challenges and projects well beyond the scope of what was imagined, or even possible, when that framework was constructed. DIRSIG 5 (D5) represents another major shift.

Blender to DIRSIG Integration: Visualization and Animation Tool

As many of you may have read in past posts by Niek, we have been using the Blender software heavily in the construction of scenes so they can be ingested into DIRSIG. While we have many Blender tools that work in the editing and exporting of existing scenes, they only utilize a small subset of the features that exist in Blender. Blender can be viewed as the vi (or emacs) of the 3D animation world, which means it can be very powerful but at times opaque to the newly initiated. After paying our dues in the apprenticeship of Blender knowledgedom, we wanted to make accessible to DIRSIG users the capabilities that have made Blender the de facto standard of the 3D animation domain. In an ongoing project that addresses the need for data sets for training machine learning algorithms and assessing the performance of target detection/exploitation algorithms, DIRS has focused on implementing tools for producing numerous instantiations of simulated scenes. Blender has been an integral part of that effort.

Refinery Scene

A new "refinery" scene will be included in the next DIRSIG release. The scene was inspired by the Ras Lanuf oil refinery in Libya.  It features an industrial site sitting on a desert-water interface.  Two DIRSIG renderings are shown below. The highlight is the refinery itself, featuring pipes and general industrial clutter.  The plant connects to a railroad track and an access road.  A small oil carrying ship is anchored nearby.  The desert itself is filled with shrubs. As with the Warehouse scene, the geometry was placed using our Blender scripts .  This refinery scene took roughly a week to construct.

Visit us at SPIE DSS in April

The RIT Digital Imaging and Remote Sensing (DIRS) Lab has a booth (#3014) in the Exhibition Hall at this year's SPIE Defense, Security, and Sensing symposium in Orlando, FL (April 25-27). Adam Goodenough, Niek Sanders, and I will be hanging out at our booth talking to conference goers about DIRSIG. Other lab personnel will be on hand to discuss RIT's other research interests, ranging from MSI and HSI algorithm development to rapid data collection and delivery. In addition to the exhibition, we will be milling about the Technical Conference. Specifically, I am chairing Session #9, entitled "Landsat Data Continuity Mission", in the "Algorithms and Technologies for Multispectral, Hyperspectral, and Ultraspectral Imagery XVII" conference on Wednesday morning. In that session, I will be showing some of the work we are doing with DIRSIG to support pre-flight modeling of the next-generation Landsat system.

DIRSIG 4.4.1 Final Release

The Final Release of DIRSIG 4.4.1 is out. This release is primarily a maintenance release to address a few bugs in the 4.4.0 release. The following is a summary of fixes and features added in this release:

- First release to include a Windows 64-bit version
- Improvements to correlated deviate models used for jitter
  - The frequency spectrum can now (optionally) include the phase
- New demonstrations
  - Indoors1
  - ExtinctionProps1
- Blender plug-ins for DIRSIG file formats
- Various LIDAR related improvements
  - Speedups for atmospheric backscatter returns
  - Bug fix for Linear-mode APD LIDAR detector model
- Various GUI bug fixes and improvements
- Improvements to streamline integration with SUMO and CityEngine
  - SUMO vehicle track import utility included on all platforms
- Simplified shared library requirements for Solaris 9 and Solaris 10
- Initial MuSES support built-in for the following platforms:
  - Windows 32-bit
  - Windows 64-bit

We wish to thank those early adopter users for their feedback.

Attributing OBJs with Blender

In a previous post, we discussed the advantages of using OBJ files in DIRSIG. This tutorial video shows how Blender can assign DIRSIG materials to an OBJ geometry mesh. Basic familiarity with Blender is a prerequisite.

Attributing an OBJ with DIRSIG materials (video)

The idea is simple: in the OBJ file, the "use material" (usemtl) string specifies a DIRSIG material ID rather than just an arbitrary material name. The OBJ's "material library" (mtllib) string is ignored by DIRSIG.
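
Here is a tiny hand-written OBJ fragment illustrating the convention; the material ID "4", object name, and geometry are made up for illustration.

    # One quad attributed with a DIRSIG material ID.
    # The mtllib line keeps the file valid for other tools,
    # but DIRSIG ignores it.
    mtllib ignored.mtl
    o grass_patch
    v 0.0 0.0 0.0
    v 1.0 0.0 0.0
    v 1.0 1.0 0.0
    v 0.0 1.0 0.0
    # usemtl names a DIRSIG material ID, not an arbitrary material name
    usemtl 4
    f 1 2 3 4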

Meet a Capture Method: Raw

Many users want just raw radiance data from DIRSIG. Rather than apply a built-in instrument model, they do their own post-processing of the radiance results. The "Raw" capture method can be used to achieve this. It writes raw DIRSIG results without applying any spatial or spectral integration. Here's a snapshot of the Raw Capture Method configuration dialog. We'll now work through each section. The Spectral Range controls the wavelengths for which DIRSIG will calculate and output results. In this example, DIRSIG will compute from 0.40 microns to 2.50 microns in 0.01 micron increments. The output image will have 211 bands of result data. The delta must divide the range into a whole number of bins. For example, the range 0.4 to 0.7 microns with a 0.2 micron delta will result in an error message when DIRSIG runs. The Spatial Response controls how many results are computed and written out per detector array element.
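
A quick sketch of the arithmetic (ordinary Python, not the DIRSIG implementation): the delta must divide the spectral range evenly, and inclusive sampling of both endpoints yields the band count.

    def band_count(lo, hi, delta, tol=1e-9):
        """Number of spectral samples from lo to hi inclusive, step delta.

        Raises if delta does not divide the range into a whole number
        of bins, mirroring the error DIRSIG reports at run time.
        """
        bins = (hi - lo) / delta
        if abs(bins - round(bins)) > tol:
            raise ValueError(f"{delta} does not evenly divide [{lo}, {hi}]")
        return int(round(bins)) + 1  # +1: both endpoints are sampled

    print(band_count(0.40, 2.50, 0.01))  # -> 211
    # band_count(0.4, 0.7, 0.2) raises: 0.2 does not evenly divide [0.4, 0.7]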

Scene Building with Blender

For quite some time, the DIRS lab has been using Blender to build small DIRSIG scenes. Blender is an advanced, open source, cross-platform, 3-D modeling package. It is available free of charge, but it has a fairly steep learning curve. We've created Blender plug-ins allowing us to import and export some of DIRSIG's native formats, including GDB and ODB. Blender also allows us to use geometry in the OBJ format. We can now create new scenes from scratch by assembling geometry, placing instances using Blender, and then saving out a DIRSIG ODB file. Traditionally, we have distributed the Bulldozer application for both attributing materials to geometry and constructing scenes. This set of plug-ins allows Blender to be used as an alternative to Bulldozer for constructing scenes. Starting with the official DIRSIG 4.4.1 release, we will be distributing these Blender scripts to the DIRSIG user community. All future DIRSIG distributions will include these scripts.

Ph.D. Work Leveraging DIRSIG

Last week, Brian Flusche (one of our students) defended his Ph.D. dissertation, entitled "Analysis of Multi-Modal Fusion for Target Detection in an Urban Environment". Brian's research reflects how many people are using DIRSIG to simulate data in an effort to explore the value of new and novel exploitation schemes. In this case, Brian was trading off the value of hyper-spectral imaging vs. polarimetric imaging vs. a combination of the two for doing target detection in an urban environment. In addition to his great use of DIRSIG (you can't expect us to be unbiased!), Brian was able to draw some very interesting and valuable conclusions. If you would like to watch his defense, it was recorded and is available here. A copy of his dissertation is available here. The DIRSIG Team would like to congratulate Brian on his work and the successful defense of his Ph.D. research. Good job, Dr. Brian.

Indoors1 Demo

We've just put together another DIRSIG "demo" simulation. It shows a camera placed indoors. The scene is illuminated both by a secondary source at the ceiling and by light entering through a window cut-out at the far end of the room. This demonstration will be included in the next DIRSIG release.

MegaScene1 and MicroScene1 Updates

We have placed an update to the MegaScene1 distribution on myDIRSIG. The major aspect of this release is that it comes with a set of DIRSIG4 .scene files. This release also includes updates to material properties, including the conversion of the tree leaf optical properties from extinction to transmission. We have found that changing this optical property provides a speedup because it eliminates a large number of calls to the exponential function (used to compute the transmission from the extinction). You can also download an update to MicroScene1. As with MegaScene1, we had neglected to push out an update of MicroScene1 with DIRSIG4 .scene files that were as simple to use as "unzip and use". Both updated scenes can be downloaded by registered users from the myDIRSIG website.
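
The saving follows from the Beer-Lambert relation: an extinction property forces the renderer to evaluate T = exp(-k*d) at every leaf interaction, whereas a stored transmission value is just read back. A rough sketch of the difference, with made-up values, assuming the leaf transmission can be treated as a fixed per-leaf property:

    import math

    k = 45.0   # hypothetical extinction coefficient (1/cm)
    d = 0.02   # hypothetical leaf thickness (cm)

    # Extinction-based property: one exp() per ray/leaf interaction
    def transmission_from_extinction(k, d):
        return math.exp(-k * d)   # Beer-Lambert law

    # Transmission-based property: computed once, then just looked up
    T_LEAF = transmission_from_extinction(k, d)  # ~0.407

    def transmission_lookup():
        return T_LEAF             # no exp() call in the inner loop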

New Year, New DIRSIG

This release is primarily a maintenance release to address a few bugs in the 4.4.0 release. A "release candidate" is a preview of the next release of DIRSIG. This version of the software is produced when we feel the software is ready for release, but we would like some last-minute user feedback. Our early adopters always help spot a few lingering issues that our release quality control process misses. We expect the official version of 4.4.1 to be released around Jan 14th, 2011. The following is a summary of fixes and features added in this release:

- First release to include a Windows 64-bit version
- Improvements to correlated deviate models used for jitter
  - The frequency spectrum can now (optionally) include the phase
- Various LIDAR related improvements
  - Speedups for atmospheric backscatter returns
  - Bug fix for Linear-mode APD LIDAR detector model
- Various GUI bug fixes and improvements
- Improvements to streamline integration with SUMO and CityEngine
  - SUMO vehicle track import utility included on all platforms