About Me

Biography

Emmanuel Christophe received his Engineering degree in Telecommunications from ENST Bretagne and his DEA in Telecommunications and Image Processing from the University of Rennes 1, both with honors, in 2003. He then spent six months at the CWAIP lab of the National University of Singapore working on video compression. In October 2006, he received his PhD from Supaero for work on hyperspectral image compression and image quality, carried out at TeSA in cooperation with the Centre National d’Etudes Spatiales (CNES), the Office National d’Etudes et de Recherches Aérospatiales (ONERA) and Alcatel Space. In 2006 he was a visiting scholar at Rensselaer Polytechnic Institute, Troy, NY, USA. From 2006 to 2008, he was a research engineer at CNES, the French Space Agency, focusing on information extraction from high-resolution optical images; since then he has also been deeply involved in the development of the open-source Orfeo Toolbox (OTB). In October 2008, he moved to Singapore to join CRISP at the National University of Singapore, where he explored new challenges for remote sensing in tropical areas and InSAR earthquake monitoring. Finally, in November 2010, he joined Google in California, where he plays with big data and thousands of computers for machine learning.

Interests

Teaching

  • INSA Toulouse: Satellite image compression for 4th-year students.
  • ENSEEIHT: Introduction to signal processing and spectral analysis; supervision of several student projects.
  • ENSICA: Digital filtering.
  • CNAM: Course on signal and noise.

Awards

  • IEEE Geoscience and Remote Sensing Society award for the 2008 Data Fusion Contest

Selected publications

Academic Journals

  1. E. Christophe, J. Michel, and J. Inglada, "Remote sensing processing: from multicore to GPU," IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing. 2011
  2. G. Licciardi, F. Pacifici, D. Tuia, S. Prasad, T. West, F. Giacco, J. Inglada, E. Christophe, J. Chanussot, and P. Gamba, "Decision fusion for the classification of hyperspectral data: Outcome of the 2008 GRS-S Data Fusion Contest," IEEE Transactions on Geoscience and Remote Sensing, vol. 47, pp. 3857-3865, nov. 2009
  3. E. Christophe and J. Inglada, "Open Source Remote Sensing: Increasing the Usability of Cutting-Edge Algorithms," IEEE Geoscience and Remote Sensing Newsletter, pp. 9-15, mar. 2009
    One common problem when working with satellite images is the gap between the cutting-edge algorithms available in the literature and the methods available in production software. Production constraints are often strong and do not cope well with research algorithms. To speed up the transfer of new algorithms to real production situations, a strong and robust software architecture is required. One of the main objectives of the Orfeo Toolbox (OTB) is to provide such a framework, helping newly implemented algorithms scale and relieving the researcher (at least partially) of such concerns. OTB is valuable for everyone working in the remote sensing imagery community. By releasing it under an open-source license, CNES (the French Space Agency) hopes to benefit from the contributions of many specialists and to help grow the practical use of satellite imagery. The first feedback has been very positive, and open-source development seems particularly well suited to increasing the usability of cutting-edge algorithms.
  4. E. Christophe, C. Mailhes, and P. Duhamel, "Hyperspectral Image Compression: Adapting SPIHT and EZW to Anisotropic 3D Wavelet Coding," IEEE Transactions on Image Processing, vol. 17, iss. 12, pp. 2334-2346, dec. 2008
    Hyperspectral images present specific characteristics that an efficient compression system should exploit. In compression, wavelets have shown good adaptability to a wide range of data while remaining of reasonable complexity, and some wavelet-based compression algorithms have been successfully used for hyperspectral space missions. This paper focuses on the optimization of a full wavelet compression system for hyperspectral images. Each step of the compression algorithm is studied and optimized. First, an algorithm is defined to find the optimal 3D wavelet decomposition in a rate-distortion sense. It is then shown that a specific fixed decomposition has almost the same performance while being preferable in terms of complexity, and that this decomposition significantly improves on the classical isotropic decomposition. One of its most useful properties is that it allows the use of zerotree algorithms. Various tree structures, creating relationships between coefficients, are compared. Two efficient zerotree-based compression methods (EZW and SPIHT) are adapted to this near-optimal decomposition with the best tree structure found. Performance is compared with an adaptation of JPEG 2000 for hyperspectral images on six areas with different statistical properties. (A toy illustration of such an anisotropic decomposition is sketched after this list.)
  5. E. Christophe and W. A. Pearlman, "Three-dimensional SPIHT Coding of Volume Images with Random Access and Resolution Scalability," EURASIP Journal on Image and Video Processing. 2008
    End users of large-volume image datasets are often interested only in certain features that should be identified as quickly as possible. For hyperspectral data, these features may reside only in certain ranges of spectral bands and certain spatial areas of the target. The same holds true for volumetric medical images of a certain region of the subject’s anatomy. High spatial resolution may be the ultimate requirement, but in many cases a lower resolution suffices, especially when rapid acquisition and browsing are essential. This paper presents a major extension of the 3D-SPIHT (set partitioning in hierarchical trees) image compression algorithm that enables random-access decoding of any specified region of the image volume, at a given spatial resolution and a given bit rate, from a single codestream. Final spatial and spectral (or axial) resolutions are chosen independently. Because the image wavelet transform is encoded in tree blocks, and the bit rates of these tree blocks are minimized through a rate-distortion optimization procedure, the various resolutions and qualities of the images can be extracted while reading a minimum amount of bits from the coded data. The attributes and efficiency of this 3D-SPIHT extension are demonstrated on several medical and hyperspectral images, in comparison with the JPEG2000 multicomponent algorithm. (A toy illustration of block-based random access is sketched after this list.)
  6. E. Christophe, P. Duhamel, and C. Mailhes, "Adaptation of zerotrees using signed binary digit representations for 3 dimensional image coding," EURASIP Journal on Image and Video Processing. 2007
    Zerotrees of wavelet coefficients have shown good adaptability to the compression of three-dimensional images. EZW, the original zerotree algorithm, shows good performance and was successfully adapted to 3D image compression. This paper focuses on the adaptation of EZW to the compression of hyperspectral images. The subordinate pass is suppressed to remove the need to keep the significant pixels in memory. To compensate for the loss due to this removal, signed binary digit representations are used to increase the efficiency of zerotrees. Contextual arithmetic coding with very limited contexts is also used. Finally, we show that this simplified version of 3D-EZW performs almost as well as the original one.
  7. E. Christophe, D. Léger, and C. Mailhes, "Quality Criteria Benchmark for Hyperspectral Imagery," IEEE Transactions on Geoscience and Remote Sensing, vol. 43, iss. 09, pp. 2103-2114, sep. 2005
    Hyperspectral data have attracted growing interest over the past few years. However, applications for hyperspectral data are still in their infancy, as handling the significant size of the data presents a challenge for the user community. Efficient compression techniques are required, and lossy compression in particular will have a role to play, provided its impact on remote sensing applications remains insignificant. To assess the data quality, suitable distortion measures relevant to end-user applications are required. Quality criteria are also of major interest for the conception and development of new sensors, to define their requirements and specifications. This paper proposes a method to evaluate quality criteria in the context of hyperspectral images. The purpose is to provide quality criteria relevant to the impact of degradations on several classification applications. Different quality criteria are considered: some are traditionally used in image and video coding and are adapted here to hyperspectral images, while others are specific to hyperspectral data. We also propose the adaptation of two advanced criteria in the presence of different simulated degradations on AVIRIS hyperspectral images. Finally, five criteria are selected to give an accurate representation of the nature and level of the degradation affecting hyperspectral data.
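
Journal item 4 above (and conference item 20 below) rely on an anisotropic 3D wavelet decomposition, i.e. a different number of decomposition levels along the spectral axis than along the spatial axes. The Python sketch below only illustrates that idea on a toy cube, using a plain Haar filter and hypothetical level counts; it is not the decomposition search or the coder described in the papers.

# A minimal sketch of an anisotropic 3D wavelet decomposition of a
# hyperspectral cube: more Haar levels along the spectral axis than
# along the two spatial axes. Level counts are hypothetical.
import numpy as np

def haar_step(x, axis):
    """One level of the orthonormal Haar transform along `axis`.
    Returns (approximation, detail); the axis length must be even."""
    x = np.moveaxis(x, axis, 0)
    approx = (x[0::2] + x[1::2]) / np.sqrt(2.0)
    detail = (x[0::2] - x[1::2]) / np.sqrt(2.0)
    return np.moveaxis(approx, 0, axis), np.moveaxis(detail, 0, axis)

def anisotropic_decomposition(cube, spectral_levels=4, spatial_levels=2):
    """Decompose a (bands, rows, cols) cube with independent level counts
    per axis; details are collected in a flat list."""
    details = []
    approx = cube.astype(float)
    # Spectral axis first (axis 0), then the two spatial axes.
    for axis, levels in ((0, spectral_levels), (1, spatial_levels), (2, spatial_levels)):
        for _ in range(levels):
            approx, d = haar_step(approx, axis)
            details.append(d)
    return approx, details

if __name__ == "__main__":
    cube = np.random.rand(64, 32, 32)        # toy hyperspectral cube
    approx, details = anisotropic_decomposition(cube)
    print(approx.shape, len(details))        # (4, 8, 8) and 8 detail subbands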
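
Journal item 5 above obtains random access and resolution scalability by coding the wavelet transform in independently decodable tree blocks. The sketch below illustrates only the addressing side of that idea: mapping a requested full-resolution spatial window to the coarse-subband tree blocks that have to be decoded. The block size and layout are hypothetical, and no actual SPIHT coding is performed.

# A toy sketch (not the paper's codec) of block-based random access:
# the transform is assumed to be grouped into independently decodable
# tree blocks of `block` x `block` roots in the coarsest subband, so a
# spatial request only needs the blocks whose coarse-scale footprint
# intersects the requested window.
def blocks_for_request(levels, block, row_range, col_range):
    """Return sorted coarse-subband block indices (block_row, block_col)
    needed to reconstruct the full-resolution window row_range x col_range."""
    scale = 2 ** levels                       # size ratio: full image / coarsest subband
    r0, r1 = row_range[0] // scale, (row_range[1] - 1) // scale
    c0, c1 = col_range[0] // scale, (col_range[1] - 1) // scale
    return sorted({(r // block, c // block)
                   for r in range(r0, r1 + 1)
                   for c in range(c0, c1 + 1)})

if __name__ == "__main__":
    # 5-level transform, 8x8 tree-block roots, request a 256x256 window
    # of a large image starting at (1000, 2000).
    needed = blocks_for_request(levels=5, block=8,
                                row_range=(1000, 1256), col_range=(2000, 2256))
    print(len(needed), "tree blocks to decode:", needed)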

International Conferences

  1. C. Gauchet, E. Christophe, A. S. Chia, and T. Yin, "InSAR monitoring of the Lusi mud volcano, East Java, from 2006 to 2010," in IEEE International Geoscience and Remote Sensing Symposium, IGARSS’11, Vancouver, BC, Canada, jul 2011.
  2. E. Christophe, M. Grizonnet, J. Michel, J. Inglada, and T. Dhar, "Bringing the raster processing algorithms of the Orfeo Toolbox Monteverdi in QGIS," in Free and Open Source Software for Geospatial, FOSS4G 2010, Barcelona, Spain, sep 2010.
  3. E. Christophe, A. S. Chia, T. Yin, and L. K. Kwoh, "2009 earthquakes in Sumatra: the use of L-band interferometry in a SAR-hostile environment," in IEEE International Geoscience and Remote Sensing Symposium, IGARSS’10, Honolulu, HI, USA, jul 2010.
  4. E. Christophe, C. M. Wong, and S. C. Liew, "Mangrove detection from high resolution optical data," in IEEE International Geoscience and Remote Sensing Symposium, IGARSS’10, Honolulu, HI, USA, jul 2010.
  5. T. Yin, E. Christophe, S. C. Liew, and S. H. Ong, "Iterative calibration of relative platform position: a new method for SAR baseline estimation," in IEEE International Geoscience and Remote Sensing Symposium, IGARSS’10, Honolulu, HI, USA, jul 2010.
  6. E. Christophe, J. Inglada, and J. Maudlin, "Crowd-sourcing satellite image analysis," in IEEE International Geoscience and Remote Sensing Symposium, IGARSS’10, Honolulu, HI, USA, jul 2010.
  7. E. Christophe and J. Inglada, "Orfeo Toolbox: from satellite images to geographic information," in Free and Open Source Software for Geospatial, FOSS4G 2009, Sydney, Australia, oct 2009.
  8. J. Osman, J. Inglada, and E. Christophe, "Interactive object segmentation in high resolution satellite images," in IEEE International Geoscience and Remote Sensing Symposium, IGARSS’09, Cape Town, South Africa, jul 2009.
  9. J. Inglada and E. Christophe, "The Orfeo toolbox remote sensing image processing software," in IEEE International Geoscience and Remote Sensing Symposium, IGARSS’09, Cape Town, South Africa, jul 2009.
  10. E. Christophe and J. Inglada, "Object counting in high resolution remote sensing images with OTB," in IEEE International Geoscience and Remote Sensing Symposium, IGARSS’09, Cape Town, South Africa, jul 2009.
  11. S. Aksoy, B. Ozdemir, S. Eckert, F. Kayitakire, M. Pesarasi, O. Aytekin, C. C. Borel, J. Cech, E. Christophe, S. Duzgun, A. Erener, K. Ertugay, E. Hussain, J. Inglada, S. Lefevre, O. Ok, D. K. San, R. Sara, J. Shan, J. Soman, I. Ulusoy, and R. Witz, "Performance evaluation of building detection and digital surface model extraction algorithms: Outcomes of the PRRS 2008 Algorithm Performance Contest," in IAPR Workshop on Pattern Recognition in Remote Sensing (PRRS 2008), Tampa, FL, USA, dec 2008.
  12. X. Delaunay, E. Christophe, C. Thiebaut, and V. Charvillat, "Best post-transforms selection in a rate-distortion sense," in IEEE International Conference on Image Processing, ICIP’08, San Diego, CA, USA, oct 2008.
    This paper deals with the optimization of a new image compression technique. After the wavelet transform of an image, blocks of coefficients are further linearly decomposed using a basis selected from a dictionary known by both the encoder and the decoder. This approach is a generalization of the bandelet transform. The paper investigates the problem of best basis selection: on each block of wavelet coefficients, the selection is made by minimizing a Lagrangian rate-distortion criterion. Theoretical expressions of the optimal Lagrangian multiplier can be computed under asymptotic hypotheses. A nearly exhaustive search for the optimal Lagrangian multiplier is carried out for the compression of high-resolution satellite images. This numerical study validates the asymptotic theoretical expressions and also provides a refined expression of the Lagrangian multiplier. Finally, the compression results obtained with these different expressions are compared to the optimal results obtained with the exhaustive search. (A toy illustration of this Lagrangian selection is sketched after this list.)
  13. E. Christophe, C. Thiebaut, and C. Latry, "Compression specification for efficient use of high resolution satellite data," in The XXI Congress, The International Society for Photogrammetry and Remote Sensing, ISPRS’08, Beijing, China, jul 2008, pp. 1283-1286.
    For the efficient use and distribution of high-resolution satellite images, several problems need to be solved. The issue comes with the increasing size of these data. The constraints differ from those of on-board compression, so different solutions can be selected. For on-board compression, the main constraints are the computational complexity and the rate attainable with qualified space equipment. For on-the-ground compression, computational constraints are not as strong, but particular care is needed to make sure that the chosen format is widely used and that users will be able to exploit the data easily.
  14. E. Christophe, D. Léger, and C. Mailhes, "New Quality Representation for Hyperspectral Images," in The XXI Congress, The International Society for Photogrammetry and Remote Sensing, ISPRS’08, Beijing, China, jul 2008, pp. 315-320.
    Assessing the quality of a hyperspectral image is a difficult task. However, this assessment is required at different stages of instrument design: for example, to evaluate the signal-to-noise ratio necessary for a particular application, or to determine the acceptable level of losses introduced by compression algorithms. It has been shown previously that a combination of five quality criteria can provide a good evaluation of the impact of a degradation on applications such as classification algorithms. This paper refines this concept, providing a representation of the degradation that allows its impact on applications to be predicted.
  15. E. Christophe, J. Inglada, and A. Giros, "ORFEO Toolbox: a complete solution for mapping from high resolution satellite images," in The XXI Congress, The International Society for Photogrammetry and Remote Sensing, ISPRS’08, Beijing, China, jul 2008, pp. 1263-1268.
    One of the main objectives of the Orfeo Toolbox (OTB) is the definition and development of tools for the operational exploitation of the future sub-metric optical and radar images (rapid mapping, tridimensional aspects, change detection, texture analysis, pattern matching, optical and radar complementarities). The purpose of the OTB is to capitalize on methodological know-how through an incremental development approach aimed at efficiently exploiting the results obtained by research studies. OTB is of interest to everyone working in the remote sensing imagery community. By releasing it under an open-source licence, CNES hopes to benefit from the contributions of many specialists and to help grow the practical use of satellite imagery.
  16. E. Christophe and J. Inglada, "Robust Road Extraction for High Resolution Satellite Images," in IEEE International Conference on Image Processing, ICIP’07, San Antonio, TX, USA, sep 2007.
    Automatic road extraction is critical for the efficient use of remote sensing imagery in most contexts. This paper proposes a robust geometric method that provides a first-level extraction. These results can be used to initialize other algorithms or as a starting point for manual road extraction. The extraction results are vectorized for GIS integration and for better interaction with human experts, who can refine them. The algorithm is fast, has very few parameters, and is only slightly affected by the image properties (resolution, noise). It is available in the open-source Orfeo Toolbox.
  17. C. Thiebaut, E. Christophe, D. Lebedeff, and C. Latry, "CNES Studies of On-Board Compression for Multispectral and Hyperspectral Images," in SPIE, Satellite Data Compression, Communications, and Archiving III, San Diego, CA, USA, aug 2007.
    Future high-resolution instruments planned by CNES for space remote sensing missions will lead to higher bit rates because of the increase in resolution, dynamic range and number of spectral channels for multispectral (up to 16 bands) and hyperspectral (hundreds of bands) imagery. Lossy data compression is therefore needed, with ever higher compression-ratio goals and low-complexity algorithms. For optimum compression performance on such data, algorithms must exploit both spectral and spatial correlation. In the case of multispectral images, CNES studies (in cooperation with Thales Alenia Space, hereafter TAS) have led to an algorithm using a fixed transform to decorrelate the spectral bands, after which the CCSDS codec compresses each decorrelated band using a suitable multispectral rate allocation procedure. This low-complexity decorrelator is suited to hardware implementation on board a satellite and is under development. In the case of hyperspectral images, CNES studies (in cooperation with TAS/TeSA/ONERA) have led to a full wavelet compression system followed by zerotree coding methods adapted to this decomposition. Other preprocessors, such as Independent Component Analysis, which could be used in both approaches, are being investigated. CNES also participates in the new CCSDS Multispectral and Hyperspectral Data Compression Working Group.
  18. E. Christophe, P. Duhamel, and C. Mailhes, "Signed Binary Digit Representation to Simplify 3D-EZW," in IEEE International Conference on Acoustics, Speech, and Signal Processing, ICASSP’07, Honolulu, HI, USA, apr 2007.
    Zerotree-based coders have shown a good ability to be adapted to 3D image coding. This paper focuses on the adaptation of EZW to the compression of hyperspectral images with reduced complexity. The subordinate pass is removed so that the locations of significant coefficients do not need to be kept in memory. To compensate for the quality loss due to this removal, a signed binary digit representation is used to increase the efficiency of zerotrees. Contextual arithmetic coding with very limited contexts is also used. Finally, we show that this simplified version of 3D-EZW performs almost as well as the original one. (A toy illustration of a signed binary digit representation is sketched after this list.)
  19. E. Christophe and W. A. Pearlman, "Three-dimensional SPIHT Coding of Hyperspectral Images with Random Access and Resolution Scalability," in Fortieth Annual Asilomar Conference on Signals, Systems, and Computers, Pacific Grove, CA, USA, oct 2006, pp. 1897-1901.
    With the growing volume of remote sensing images, fast access to certain features of an image is becoming critical. This access could target part of the spectrum, an area of the image, or a given spatial resolution. An adaptation of the 3D-SPIHT image compression algorithm is presented that allows random access to part of the image, whether spatial or spectral. Resolution scalability is also available, enabling the decoding of different resolution images from the compressed bitstream of the hyperspectral data. Final spatial and spectral resolutions are chosen independently. From the same compressed bitstream, images of various resolutions and qualities can be extracted while reading a minimum amount of bits from the coded data. All this is done while reducing the memory needed during compression.
  20. E. Christophe, C. Mailhes, and P. Duhamel, "Best Anisotropic 3-D wavelet decomposition in a rate-distortion sense," in IEEE International Conference on Acoustics, Speech, and Signal Processing, ICASSP’06, Toulouse, France, may 2006, p. ii-17.
    Hyperspectral sensors have attracted growing interest over the past few decades for Earth observation as well as deep-space exploration. However, the amount of data provided by such sensors requires an efficient compression system, which is yet to be defined. It is hoped that the particular statistical properties of such images can be used to obtain very efficient compression algorithms. This paper proposes a method to find the most suitable wavelet decomposition for hyperspectral images and introduces the possibility of non-isotropic decomposition. The decomposition providing the optimal rate-distortion trade-off is chosen. The obtained decomposition exhibits better rate-distortion performance than the isotropic decomposition, at high as well as low bit rates.
  21. E. Christophe, D. Léger, and C. Mailhes, "Comparison and evaluation of quality criteria for hyperspectral imagery," in SPIE, Image Quality and System Performance II, San Jose, CA, USA, jan 2005, pp. 204-213.
    Hyperspectral data have attracted growing interest over the past few years. However, applications for hyperspectral data are still in their infancy, and handling the significant size of hyperspectral data presents a challenge for the user community. To enable efficient data compression without losing the potential of hyperspectral data, the notion of data quality is crucial for the development of applications. To assess the data quality, quality criteria relevant to end-user applications are required. This paper proposes a method to evaluate quality criteria. The purpose is to provide quality criteria that correspond well to the impact of degradation on end-user applications. Several quality criteria adapted to the hyperspectral context are evaluated. Finally, five criteria are selected to give a good representation of the nature and level of the degradation affecting hyperspectral data. (A toy illustration of such quality criteria is sketched after this list.)
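
Conference item 12 above selects, for each block of wavelet coefficients, the basis from a dictionary that minimizes a Lagrangian rate-distortion cost D + λR. The sketch below illustrates that selection rule on a random toy dictionary, with a crude rate proxy; the actual dictionary, quantizer and entropy coder of the paper are not reproduced.

# A toy sketch (not the paper's code) of Lagrangian rate-distortion basis
# selection: each candidate orthonormal basis is scored by D + lambda * R
# after uniform quantization of the block, and the cheapest basis is kept.
# The random dictionary and the nonzero-count rate proxy are hypothetical.
import numpy as np

def quantize(coeffs, step):
    return np.round(coeffs / step) * step

def best_basis(block, bases, lam, step=0.1):
    """Return (index, cost) of the basis minimizing D + lam * R for `block`."""
    x = block.ravel()
    best_idx, best_cost = None, np.inf
    for i, B in enumerate(bases):
        coeffs = B.T @ x                       # analysis in the candidate basis
        q = quantize(coeffs, step)
        distortion = float(np.sum((coeffs - q) ** 2))  # orthonormal basis: same error in signal domain
        rate = float(np.count_nonzero(q))      # crude stand-in for the entropy coder
        cost = distortion + lam * rate
        if cost < best_cost:
            best_idx, best_cost = i, cost
    return best_idx, best_cost

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    dictionary = [np.linalg.qr(rng.standard_normal((16, 16)))[0] for _ in range(4)]
    block = rng.standard_normal((4, 4))        # one 4x4 block of wavelet coefficients
    idx, cost = best_basis(block, dictionary, lam=0.05)
    print("selected basis:", idx, "cost:", round(cost, 4))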
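
Conference item 18 above (and journal item 6) replace the subordinate pass of 3D-EZW with a signed binary digit representation of the coefficient magnitudes. The non-adjacent form computed below is one standard signed binary digit representation, shown only to illustrate the property being exploited (fewer nonzero digits than plain binary on average); whether it matches the exact representation used in the papers is an assumption.

# A minimal sketch of a signed binary digit representation (non-adjacent
# form): digits are in {-1, 0, 1} and no two adjacent digits are nonzero,
# so the number of nonzero digits is, on average, lower than in plain binary.
def naf(n):
    """Return the non-adjacent form of a non-negative integer, least
    significant digit first."""
    digits = []
    while n != 0:
        if n % 2:
            d = 2 - (n % 4)   # +1 if n = 1 mod 4, -1 if n = 3 mod 4
            n -= d
        else:
            d = 0
        digits.append(d)
        n //= 2
    return digits

if __name__ == "__main__":
    for n in (7, 12, 29):
        d = naf(n)
        value = sum(di * 2 ** i for i, di in enumerate(d))
        nonzero = sum(1 for di in d if di)
        print(n, d, "reconstructed:", value, "nonzero digits:", nonzero)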
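
Conference item 21 above (and journal item 7) compare quality criteria for hyperspectral imagery. The sketch below computes three generic, standard measures on a simulated degradation; the exact five criteria retained in those papers are not reproduced here, so this is only an illustration of what such criteria look like.

# A toy sketch of generic hyperspectral quality criteria (RMSE, maximum
# absolute deviation, mean spectral angle) on a simulated degradation.
# These are standard measures used for illustration only.
import numpy as np

def rmse(ref, deg):
    return float(np.sqrt(np.mean((ref - deg) ** 2)))

def max_abs_diff(ref, deg):
    return float(np.max(np.abs(ref - deg)))

def mean_spectral_angle(ref, deg, axis=0):
    """Mean angle (radians) between reference and degraded spectra; the
    spectral dimension is along `axis`."""
    dot = np.sum(ref * deg, axis=axis)
    norms = np.linalg.norm(ref, axis=axis) * np.linalg.norm(deg, axis=axis)
    cos = np.clip(dot / np.maximum(norms, 1e-12), -1.0, 1.0)
    return float(np.mean(np.arccos(cos)))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    ref = rng.random((64, 32, 32))                      # (bands, rows, cols)
    deg = ref + 0.01 * rng.standard_normal(ref.shape)   # simulated degradation
    print(rmse(ref, deg), max_abs_diff(ref, deg), mean_spectral_angle(ref, deg))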

Thesis

  1. E. Christophe, "Compression des images hyperspectrales et son impact sur la qualité des données," PhD Thesis , 2006.
    Hyperspectral images present specific characteristics that an efficient compression system should exploit. This thesis focuses on the definition and optimization of a full wavelet compression system for hyperspectral images. In compression, wavelets have shown good adaptability to a wide range of data while remaining of reasonable complexity, and zerotree-based algorithms are among the most efficient for coding wavelet coefficients. The first part of this work therefore defines a near-optimal wavelet decomposition, in a rate-distortion sense, for hyperspectral images. Efficient compression methods based on zerotree coding (EZW, SPIHT) are then adapted to this decomposition, and their performance is compared with an adaptation of JPEG 2000 for hyperspectral images, demonstrating the interest of the proposed methods. End users of hyperspectral images are often interested only in certain features of the image (resolution or area), depending on the application. A further adaptation of the proposed compression algorithm is therefore presented to allow random access, so that only part of the image (spatial or spectral) needs to be decoded. Resolution scalability is also available, enabling images of different resolutions to be decoded from the same bitstream while reading a minimum number of bits; the desired spatial and spectral resolutions can be specified independently. Finally, lossy compression cannot be discussed without first defining a suitable distortion criterion. A group of five complementary quality criteria is therefore defined, chosen to ensure that the compression algorithm does not introduce degradations that would compromise the usefulness of the data. A new method based on these five criteria shows a good ability to distinguish between different degradations of the image. Applied to the new compression algorithm, this method shows that the degradations remain low for rates around 1.0 bit per pixel per band.

National Conferences and Workshops

  1. J. Inglada, M. Grizonnet, E. Christophe, and J. Michel, "The Quantitative Remote Sensing Tools in Orfeo Toolbox," in Recent Advances in Quantitative Remote Sensing, RAQRS’III, Valencia, Spain, sep 2010.
  2. E. Christophe, "Remote Sensing and ITK: The Orfeo Toolbox," in The Insight Toolkit 2010 Workshop: ITK Past, Present, and Future, Bethesda, MD, USA, jun 2010.
  3. E. Christophe, "SAR Interferometry for volcanoes," in USGS Workshop on Remote Sensing and GIS Modeling on Volcanoes, Singapore, mar 2010.
  4. E. Christophe, C. Thiebaut, W. A. Pearlman, C. Latry, and D. Lebedeff, "Zerotree-Based Compression Algorithm for Spaceborne Hyperspectral Sensor," in ESA, On-Board Payload Data Compression Workshop, OBPDC 2008, Noordwijk, Netherlands, jun 2008.
  5. X. Delaunay, C. Thiebaut, E. Christophe, R. Ruiloba, M. Chabert, V. Charvillat, and G. Morin, "Lossy Compression by Post-Transforms in the Wavelet Domain," in ESA, On-Board Payload Data Compression Workshop, OBPDC 2008, Noordwijk, Netherlands, jun 2008.
  6. J. -M. Gaucel, E. Christophe, C. Pierangelo, C. Thiebaut, Y. Bobichon, E. Pequignot, F. Lemasson, and D. Lebedeff, "Analysis of Lossy and Lossless Compression Approaches for Future Ultraspectral Sounder Missions," in ESA, On-Board Payload Data Compression Workshop, OBPDC 2008, Noordwijk, Netherlands, jun 2008.
  7. J. Inglada, E. Christophe, and H. de Boissezon, "From satellite images to operational applications: generic tools for specific user needs," in Space Appli 2008, Toulouse, France, apr 2008.
  8. J. Inglada, E. Christophe, S. Marzocchi-Polizzi, H. de Boissezon, and B. Boissin, "ORFEO Tool Box: an open source library of image processing algorithms for SAR and optical high resolution images," in ESA-EUSC 2008: Image Information Mining: pursuing automation of geospatial intelligence for environment and security, Frascati, Italy, mar 2008.
  9. E. Christophe, "Fast and robust algorithm for road extraction in high resolution optical images," in 7th CNES/DLR Workshop on Information Extraction and Scene Understanding for Meter Resolution Images, Oberpfaffenhofen, Germany, mar 2007.
  10. E. Christophe, C. Mailhes, and P. Duhamel, "Décorrelation des images hyperspectrales avec une décomposition 3D en ondelettes," in Workshop on Transform Based on Independent Component Analysis for Audio, Video and Hyperspectral Images Data Reduction and Coding, Paris, France, jul 2006.
    The amount of data produced by hyperspectral sensors requires an efficient compression algorithm, which is yet to be defined. The particular statistical properties of these images should make it possible to obtain efficient compression algorithms. Given its properties and low complexity, the wavelet transform is a promising candidate for the decorrelation of hyperspectral images. This paper proposes a method to find the optimal wavelet decomposition for hyperspectral images and introduces the possibility of a non-isotropic decomposition. The decomposition giving the best rate-distortion trade-off is chosen. This decomposition gives much better rate-distortion performance than the classical isotropic decomposition. The drawback of this optimal decomposition lies in its high complexity. A second, fixed decomposition is therefore defined and shows near-optimal performance while keeping a low complexity.
  11. E. Christophe, D. Léger, and C. Mailhes, "Images hyperspectrales et critères qualité," in Colloque annuel des Doctorants EDIT’05, Toulouse, France, apr 2005, pp. 132-136.
    Interest in hyperspectral data has been growing over the past few years. Processing such large amounts of data presents a challenge, and most applications are still under development. To define an efficient data compression system without losing the significant potential of these images, the definition of quality criteria is an essential step. This paper presents a method for validating quality criteria according to their ability to predict the influence of degradations on end-user applications. A large number of quality criteria are defined and then evaluated. Finally, five criteria are retained to give a good estimate of the impact of degradations on standard classification applications.

Book chapter

  1. E. Christophe, "Hyperspectral Data Compression Tradeoff," , Prasad, S., Bruce, L. M., and Chanussot, J., Eds., Springer-Verlag Berlin Heidelberg, 2011.

Patent

  1. E. Cansot, E. Christophe, and A. Rosak, Process and instrument for reconstruction of an irregularly sampled narrow-band signal, WO2010043442/FR2937160, 2010.

IEEE papers above are subject to the following copyright:
“©20xx IEEE. Personal use of this material is permitted. However, permission to reprint/republish this material for advertising or promotional purposes or for creating new collective works for resale or redistribution to servers or lists, or to reuse any copyrighted component of this work in other works must be obtained from the IEEE.”

EURASIP papers above are published under the Creative Commons Attribution Licence.
