Decimation

Latest revision as of 16:15, 20 January 2012

== It started with Jon Zarge wanting to generate models for our in-house biomagnetism project. ==

Jon was a member of the Software Technology Program (STP) at CRD. The STP recruited high-potential candidates with Bachelor's degrees and offered them an opportunity to get a Master's degree at a local university (usually Rensselaer Polytechnic Institute). While they worked on the degree at Company expense, they rotated through two or three projects within CRD. Jon was assigned to our group to work on a new medical imaging modality, biomagnetism. One of the challenging problems of this project was to solve an inverse problem: from measurements obtained on the scalp of the patient, find the source of the biomagnetic field. Most researchers used idealized, spherical models of the skull and brain. The biomag team set out to generate patient-specific models. Their first choice for model generation was, of course, Marching Cubes. Marching Cubes posed two problems, however. First, the models generated by Marching Cubes were huge in polygon count. Second, as we found out, the models had topology problems. Jon's task was to generate models that could be analyzed. If I recall correctly, we did not actually have the code to do the analysis.

Jon was being mentored by Will Schroeder. Will laid out a high-level approach to solve the problem. The approach eliminated single vertices according to an out-of-plane criterion. Then the neighborhood of the removed vertex was re-triangulated. Will had a strong background in polygonal model topology. Step one of the process was to determine and encode the topology of the triangular surface. Jon wrote code to build a topological data structure to enable the vertex-removal, hole-filling approach. This is when Jon brought the severity of the Marching Cubes topology problem to my attention. This had already been pointed out by Durst's letter to the ACM Siggraph Quarterly. To be honest, I had tried to verify the extent of Durst's discovery but did not have the right tools. Jon's topology builder was just what I needed. While he worked on refining the initial decimation implementation, I worked on fixing the hole problem in Marching Cubes.

The topology problem was easier to solve than I had anticipated. The root of the problem was my early assumption that all of the complementary cases, cases where inside/outside was reversed, were the same. Once I realized that the complementary cases needed different treatment, I was able to resolve the ambiguity. Fortunately, my early code to generate the cases had separate inputs for the complementary cases. I generated a new case table, tested some generated models against Jon's topology checker, and felt confident I had solved the problem.
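The out-of-plane criterion described above can be sketched as follows. This is an illustrative reconstruction, not the original C or LYMB code: a candidate vertex is kept or removed by measuring its distance to the average plane of its surrounding triangles. The function names and NumPy details are my own.

```python
import numpy as np

def distance_to_average_plane(vertex, ring_triangles):
    """Distance from a vertex to the area-weighted average plane of
    the triangles surrounding it. A vertex whose distance falls below
    a user threshold is a candidate for removal and re-triangulation.

    ring_triangles: list of (3, 3) arrays, each row one triangle vertex.
    """
    normals, areas, centroids = [], [], []
    for tri in ring_triangles:
        a, b, c = tri
        n = np.cross(b - a, c - a)            # face normal; length = 2 * area
        areas.append(0.5 * np.linalg.norm(n))
        normals.append(n / np.linalg.norm(n))
        centroids.append(tri.mean(axis=0))
    # The area-weighted average normal and centroid define the average plane.
    n_avg = np.average(normals, axis=0, weights=areas)
    n_avg /= np.linalg.norm(n_avg)
    c_avg = np.average(centroids, axis=0, weights=areas)
    return abs(np.dot(n_avg, vertex - c_avg))
```

A vertex lying in the plane of a flat neighborhood scores zero and would be removed under any positive threshold; a vertex lifted out of that plane scores its lift height and survives if the threshold is tighter.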

Jon's implementation was in C. I recall that he called the algorithm ''intelligent triangle decimation''. At the time, our group was using LYMB as its development and delivery platform. Like many large systems, LYMB had a bit of a learning curve. The Biomag group needed a quick implementation of a decimation algorithm, and Jon was not a LYMB expert. We decided to let him work out the details of decimation outside of the LYMB system. Will reimplemented Jon's stand-alone algorithm in LYMB. In the process of defining the object-oriented design, Will was able to generalize the decimation algorithm. Generalization of algorithms is a common theme followed by our group over the years.

I was a bit on the sidelines of the decimation algorithm development, concentrating on generating valid surfaces from Marching Cubes. I realized, though, that this was a Siggraph-quality algorithm. I joined Will and Jon to start writing a description of the algorithm. I searched the literature for surface reduction techniques. I was surprised to find that there was nothing published on this topic. I also took on the task of generating examples to illustrate the power of the algorithm. Successful Siggraph papers have two major components: innovative algorithms and compelling examples. I chose examples from medical imaging, industrial inspection, and terrain modeling.

[http://www.siggraph.org/conference/ Siggraph 1992] was held in Chicago. Although we could not find other published work on polygon decimation, there were three papers accepted that year that dealt with reducing polygon count. Hugues Hoppe from the University of Washington and Greg Turk from Georgia Tech had the other two papers in the session. Will gave the talk to a packed room. I recall the talk was not in the main meeting room; Siggraph had parallel sessions.

The decimation algorithm had a large impact on my career. The [http://scholar.google.com/scholar?cites=15027901412930997819&as_sdt=5,33&sciodt=0,33&hl=en Siggraph paper] is highly cited. But more than that, decimation made Marching Cubes a practical algorithm. It became a cornerstone of our model creation pipeline.