If you cannot find anything more, look for something else (Brigitte Fontaine)


# publications (& other things public)

Co-working: collaborations, colleagues & co-authors
Collaborative projects: MajIC (E. Chouzenoux) + Amidex BIFROST (C. Chaux) + TraDe-OPT (H2020 ITN) + ANR IA Chair BRIDGEABLE (J.-C. Pesquet) + ANR Artificial Intelligence JCJC grant GraphIA (F. Malliaros) + ANR BRUIT-FM
Networks: Academia - ACM - AMiner/ArNetMiner - arXiv - biorXiv - blog - DBLP - Github - GoogleScholar - HAL - LinkedIn - Mastodon - Math Genealogy - MatlabCentral - ORCID - PublicationsList - PubMed/NCBI - Quora - ResearcherID/WoS - ResearchGate - StackExchange - Twitter - Viadeo

## Main topics

[Compression/Denoising/Deconvolution/Restoration/Learning]+[Time-frequency/time-scale/wavelets/filterbanks]+[Sparsity analysis]+[Analytical Chemistry data (chromatography/NMR)]+[Geoscience/geophysical processing]+[Vibration/combustion in engines]+[Bioinformatics gene network inference & clustering]+[Seismic data and geological mesh compression]+[Real-time extrapolation]

# in limbo

• Local directional transforms for seismic data processing, Laurent Duval, Caroline Chaux, Jérôme Gauthier
• COLT (tunable redundancy complex short-time Fourier transform frames), Jérôme Gauthier, Laurent Duval
• Joint restoration/segmentation of multicomponent images with variational Bayes and higher-order graphical models (HOGMep) [hal]
• Half a century of Seismic Data Compression: an expanding history (see some seismic data compression papers, codes, links and folks)
• BRANE HK Cone (RNA-Seq data normalization)

# being written

• Dual Sparse Partial Least Squares: the dual.spls regression package (submitted, R package CRAN)
• CalValXy: well-balanced and stratified calibration/validation splitting using both predictors $X$ and response $y$ (to be submitted, Technometrics)
• On peaks, baseline and noise separation for analytical signals. Applications to chromatography
• BEADS unstrung: automated baseline and noise filtering guided by sparsity and positivity on analytical data
• SGrafilt (an update on the Savitzky-Golay filter)

# published

## Articles/Journal papers

1. PENDANTSS: PEnalized Norm-ratios Disentangling Additive Noise, Trend and Sparse Spikes (aka blind BEADS trend + SPOQ deconvolution). Paul Zheng, Émilie Chouzenoux, Laurent Duval, IEEE Signal Processing Letters, 2023 [doi] [arxiv] [pdf] [opus] [code]
Abstract: Denoising, detrending, deconvolution: usual restoration tasks, traditionally decoupled. Coupled formulations entail complex ill-posed inverse problems. We propose PENDANTSS for joint trend removal and blind deconvolution of sparse peak-like signals. It blends a parsimonious prior with the hypothesis that smooth trend and noise can somewhat be separated by low-pass filtering. We combine the generalized pseudo-norm ratio SOOT/SPOQ sparse penalties lp/lq with the BEADS ternary assisted source separation algorithm. This results in a both convergent and efficient tool, with a novel Trust-Region block alternating variable metric forward-backward approach. It outperforms comparable methods, when applied to typically peaked analytical chemistry signals. Reproducible code is provided.
2. BraneMF: Integration of Biological Networks for Functional Analysis of Proteins. Surabhi Jagtap, Abdulkadir Çelikkanat, Aurélie Pirayre, Frédérique Bidard, Laurent Duval, Fragkiskos D. Malliaros, Bioinformatics, December 2022 [doi] [pdf] [opus] [code]
Motivation: The cellular system of a living organism is composed of interacting bio-molecules that control cellular processes at multiple levels. Their correspondences are represented by tightly regulated molecular networks. The increase of omics technologies has favored the generation of large-scale disparate data and the consequent demand for simultaneously using molecular and functional interaction networks: gene co-expression, protein-protein interaction (PPI), genetic interaction, and metabolic networks. They are rich sources of information at different molecular levels, and their effective integration is essential to understand cell functioning and their building blocks (proteins). Therefore, it is necessary to obtain informative representations of proteins and their proximity, that are not fully captured by features extracted directly from a single informational level. We propose BraneMF, a novel random walk-based matrix factorization method for learning node representation in a multilayer network, with application to omics data integration.
Results: We test BraneMF with PPI networks of Saccharomyces cerevisiae, a well-studied yeast model organism. We demonstrate the applicability of the learned features for essential multi-omics inference tasks: clustering, function and PPI prediction. We compare it to state-of-the-art integration methods for multilayer networks. BraneMF outperforms baseline methods by achieving high prediction scores for a variety of downstream tasks. The robustness of results is assessed by an extensive parameter sensitivity analysis. Availability: BraneMF is freely available at: https://github.com/Surabhivj/BraneMF
3. BRANEnet: Embedding Multilayer Networks for Omics Data Integration. Surabhi Jagtap, Aurélie Pirayre, Frédérique Bidard, Laurent Duval, Fragkiskos D. Malliaros, BMC Bioinformatics, 2022 [doi] [pdf] [opus] [code]
Background: Gene expression is regulated at different molecular levels, including chromatin accessibility, transcription, RNA maturation, and transport. These regulatory mechanisms have strong connections with cellular metabolism. In order to study the cellular system and its functioning, omics data at each molecular level can be generated and efficiently integrated. Here, we propose BRANEnet, a novel multi-omics integration framework for multilayer heterogeneous networks. BRANEnet is an expressive, scalable, and versatile method to learn node embeddings, leveraging random walk information within a matrix factorization framework. Our goal is to efficiently integrate multi-omics data to study different regulatory aspects of multilayered processes that occur in organisms. We evaluate our framework using multi-omics data of Saccharomyces cerevisiae, a well-studied yeast model organism. Results: We test BRANEnet on transcriptomics (RNA-seq) and targeted metabolomics (NMR) data for wild-type yeast strain during a heat-shock time course of 0, 20, and 120 minutes. Our framework learns features for differentially expressed bio-molecules showing heat stress response. We demonstrate the applicability of the learned features for targeted omics inference tasks: transcription factor (TF)-target prediction, integrated omics network (ION) inference, and module identification. The performance of BRANEnet is compared with existing network integration methods. Our model outperforms baseline methods by achieving high prediction scores for a variety of downstream tasks
Keywords: multi-omics data; multilayer network; biological network integration; graph representation learning; regulatory network inference
4. Genome-wide TSS distribution in three related Clostridia with normalized Capp-Switch sequencing. Rémi Hocq, Surabhi Jagtap, Magali Boutard, Andrew C. Tolonen, Laurent Duval, Aurélie Pirayre, Nicolas Lopes Ferreira, François Wasels, Microbiology Spectrum, 2022 [doi] [pdf] [epub] [bib]
Abstract: Transcription initiation is a tightly regulated process that is crucial for many aspects of prokaryotic physiology. High-throughput transcription start site (TSS) mapping can shed light on global and local regulation of transcription initiation, which in turn may help us understand and predict microbial behavior. In this study, we used Capp-Switch sequencing to determine the TSS positions in the genomes of three model solventogenic clostridia: Clostridium acetobutylicum ATCC 824, C. beijerinckii DSM 6423, and C. beijerinckii NCIMB 8052. We first refined the approach by implementing a normalization pipeline accounting for gene expression, yielding a total of 12,114 mapped TSSs across the species. We further compared the distributions of these sites in the three strains. Results indicated similar distribution patterns at the genome scale, but also some sharp differences, such as for the butyryl-CoA synthesis operon, particularly when comparing C. acetobutylicum to the C. beijerinckii strains. Lastly, we found that promoter structure is generally poorly conserved between C. acetobutylicum and C. beijerinckii. A few conserved promoters across species are discussed, showing interesting examples of how TSS determination and comparison can improve our understanding of gene expression regulation at the transcript level. Importance: Solventogenic clostridia have been employed in industry for more than a century, initially being used in the acetone-butanol-ethanol (ABE) fermentation process for acetone and butanol production. Interest in these bacteria has recently increased in the context of green chemistry and sustainable development. However, our current understanding of their genomes and physiology limits their optimal use as industrial solvent production platforms. The gene regulatory mechanisms of solventogenesis are still only partly understood, impeding efforts to increase rates and yields.
Genome-wide mapping of transcription start sites (TSSs) for three model solventogenic Clostridium strains is an important step toward understanding mechanisms of gene regulation in these industrially important bacteria.
Keywords: Industrial Microbiology; green chemistry; sustainable development; prokaryote; transcription start site (TSS); gene regulatory mechanisms; acetone-butanol-ethanol (ABE) fermentation
5. Sparse Signal Reconstruction for Nonlinear Model via Piecewise Rational Optimization. Arthur Marmin, Marc Castella, Jean-Christophe Pesquet, Laurent Duval. Signal Processing, February 2021 [doi] [pdf] [bib] [arxiv] [hal]
Abstract: We propose a method to reconstruct sparse signals degraded by a nonlinear distortion and acquired at a limited sampling rate. Our method formulates the reconstruction problem as a nonconvex minimization of the sum of a data fitting term and a penalization term. In contrast with most previous works, which settle for approximate local solutions, we seek a global solution to the resulting challenging nonconvex problem. Our global approach relies on the so-called Lasserre relaxation of polynomial optimization. We specifically include in our approach the case of piecewise rational functions, which makes it possible to address a wide class of nonconvex exact and continuous relaxations of the $\ell_{0}$ penalization function. Additionally, we study the complexity of the optimization problem. It is shown how to use the structure of the problem to lighten the computational burden efficiently. Finally, numerical simulations illustrate the benefits of our method in terms of both global optimality and signal reconstruction.
Keywords: polynomial and rational optimization, global optimization, l0 penalization, sparse modelling
6. Glucose-lactose mixture feeds in industry-like conditions: a gene regulatory network analysis on the hyperproducing Trichoderma reesei strain Rut-C30. Aurélie Pirayre, Laurent Duval, Corinne Blugeon, Cyril Firmo, Sandrine Perrin, Etienne Jourdier, Antoine Margeot, Frédérique Bidard. BMC Genomics [doi] [biorxiv] [hal] [pdf] [bib]
Abstract: Background: The degradation of cellulose and hemicellulose molecules into simpler sugars such as glucose is part of the second generation biofuel production process. Hydrolysis of lignocellulosic substrates is usually performed by enzymes produced and secreted by the fungus Trichoderma reesei. Studies identifying transcription factors involved in the regulation of cellulase production have been conducted, but no overview of the whole regulation network is available. A transcriptomic approach with mixtures of glucose and lactose, used as a substrate for cellulase induction, was used to help us decipher missing parts in the network. Results: Experimental results confirmed the impact of the sugar mixture on the enzymatic cocktail composition. The transcriptomic study shows a temporal regulation of the main transcription factors and an impact of lactose concentration on the transcriptional profile. A gene regulatory network (GRN) built using the BRANE Cut software reveals three sub-networks related to i) a positive correlation between lactose concentration and cellulase production, ii) a particular dependence of beta-glucosidase regulation on lactose and iii) a negative regulation of the development process and growth. Conclusions: This work is the first transcriptomic study investigating the effects of pure and mixed carbon sources in fed-batch mode. Our study exposes a co-orchestration of xyr1, clr2 and ace3 for cellulase and hemicellulase induction and production, a fine regulation of beta-glucosidase and a decrease of growth in favor of cellulase production. These conclusions provide us with potential targets for further genetic engineering leading to better cellulase-producing strains.
Keywords: Trichoderma reesei, carbon sources, cellulases, transcriptome, fed-batch fermentation, data science, Gene Regulatory Network.
7. SPOQ lp-Over-lq Regularization for Sparse Signal Recovery applied to Mass Spectrometry. Afef Cherni, Emilie Chouzenoux, Laurent Duval, Jean-Christophe Pesquet. IEEE Transactions on Signal Processing, 2020, Volume 68, pages 6070-6084 [opus] [code] [matlabcentral] [doi] [hal] [arxiv] [pdf] [bib] [SOOT]
Abstract: Underdetermined or ill-posed inverse problems require additional information for sound solutions with tractable optimization algorithms. Sparsity provides effective heuristics in this respect, with numerous applications in signal restoration, image recovery, or machine learning. Since the l0 count measure is barely tractable, many statistical or learning approaches have resorted to computable proxies, such as the l1 norm. However, the latter does not exhibit the desirable property of scale invariance for sparse data. Extending the SOOT Euclidean/Taxicab l1-over-l2 norm ratio initially introduced for blind deconvolution, we propose SPOQ, a family of smoothed (approximately) scale-invariant penalty functions. It consists of a Lipschitz-differentiable surrogate for lp-over-lq quasi-norm/norm ratios with p in ]0,2[ and q>=2. This surrogate is embedded into a novel majorize-minimize trust-region approach, generalizing the variable metric forward-backward algorithm. For naturally sparse mass-spectrometry signals, we show that SPOQ significantly outperforms l0, l1, Cauchy, Welsch, SCAD and Cel0 penalties on several performance measures. Guidelines on SPOQ hyperparameter tuning are also provided, suggesting simple data-driven choices.
Keywords: Inverse problems, majorize-minimize method, mass spectrometry, nonconvex optimization, nonsmooth optimization, norm ratio, quasinorm, sparsity.
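To see numerically why such norm ratios favor sparsity, here is a toy check of the plain (unsmoothed) lp-over-lq ratio. The function name and constants are illustrative only, not the paper's Lipschitz-differentiable SPOQ surrogate:

```python
import numpy as np

def lp_over_lq(x, p=0.75, q=2.0, eps=1e-10):
    """Plain lp-over-lq quasi-norm/norm ratio (no smoothing)."""
    lp = np.sum(np.abs(x) ** p) ** (1.0 / p)
    lq = np.sum(np.abs(x) ** q) ** (1.0 / q)
    return lp / (lq + eps)

# Scale invariance: the ratio is unchanged under x -> c * x.
x = np.array([0.0, 3.0, 0.0, -1.0, 0.0])
assert np.isclose(lp_over_lq(x), lp_over_lq(10 * x))

# Sparser vectors score lower for p < q (both have the same l2 norm).
dense = np.ones(5)
sparse = np.array([np.sqrt(5.0), 0, 0, 0, 0])
assert lp_over_lq(sparse) < lp_over_lq(dense)
```

Unlike the l1 norm alone, rescaling the data leaves the penalty untouched, which is the scale-invariance property the abstract refers to.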
8. HexaShrink, an exact scalable framework for hexahedral meshes with attributes and discontinuities: multiresolution rendering and storage of geoscience models, Jean-Luc Peyrot, Laurent Duval, Frédéric Payan, Lauriane Bouard, Lénaïc Chizat (now at INRIA, ENS, PSL Research University Paris), Sébastien Schneider (now at HoloMake), Marc Antonini, Computational Geosciences, August 2019 [doi]+[link1]+[link2]+[bib]+[opus]+[postprint]+[arxiv]+[hal]+[animations]+[blog]+[patent]+[linkedin]
With the huge progress in data acquisition realized over the past decades, and acquisition systems now able to produce high-resolution grids and point clouds, the digitization of physical terrains becomes increasingly precise. Such extreme quantities of generated and modeled data greatly impact computational performance on many levels of high-performance computing (HPC): storage media, memory requirements, transfer capability, and finally simulation interactivity, necessary to exploit this instance of big data. Efficient representations and storage are thus becoming "enabling technologies" in HPC experimental and simulation science (Foster et al., Computing Just What You Need: Online Data Analysis and Reduction at Extreme Scales, 2017). We propose HexaShrink, an original decomposition scheme for structured hexahedral volume meshes. The latter are used for instance in biomedical engineering, materials science, or geosciences. HexaShrink provides a comprehensive framework allowing efficient mesh visualization and storage. Its exactly reversible multiresolution decomposition yields a hierarchy of meshes of increasing levels of detail, in terms of either geometry, continuous or categorical properties of cells. Starting with an overview of volume mesh compression techniques, our contribution blends coherently different multiresolution wavelet schemes in different dimensions. It results in a global framework preserving discontinuities (faults) across scales, implemented as a fully reversible upscaling at different resolutions. Experimental results are provided on meshes of varying size and complexity. They emphasize the consistency of the proposed representation, in terms of visualization, attribute downsampling and distribution at different resolutions. Finally, HexaShrink yields gains in storage space when combined with lossless compression techniques.
Keywords: Compression, Corner point grid, Discrete wavelet transform, Geometrical discontinuities, Hexahedral volume meshes, High-Performance Computing (HPC), Multiscale methods, Simulation, Upscaling
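The exact reversibility claimed above rests on integer lifting schemes. As a minimal sketch, here is a one-level integer-to-integer Haar lifting step (an illustration of the principle, not the HexaShrink codec itself):

```python
import numpy as np

def int_haar_forward(x):
    """One-level integer Haar transform via lifting: exactly
    reversible because only integer adds/subtracts and floor
    divisions are used (no rounding loss)."""
    a, b = x[0::2].copy(), x[1::2].copy()
    d = b - a            # predict step: detail coefficients
    s = a + d // 2       # update step: coarse approximation
    return s, d

def int_haar_inverse(s, d):
    """Undo the lifting steps in reverse order."""
    a = s - d // 2
    b = a + d
    x = np.empty(a.size + b.size, dtype=a.dtype)
    x[0::2], x[1::2] = a, b
    return x

x = np.array([7, 3, 4, 4, 9, 1, 0, 5])
s, d = int_haar_forward(x)
assert np.array_equal(int_haar_inverse(s, d), x)  # exact reconstruction
```

The coarse sequence `s` plays the role of the lower-resolution mesh level; iterating the forward step on `s` builds the multiresolution hierarchy.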
9. BRANE Clust: cluster-assisted gene regulatory network inference refinement, Aurélie Pirayre, Camille Couprie, Laurent Duval, Jean-Christophe Pesquet, IEEE/ACM Transactions on Computational Biology and Bioinformatics, May-June 2018, Vol. 15, Number 3, pages 850-860 [code]+[doi]+[bib]+[preprint]+[biorxiv]+[researchgate]+[hal]+[pubmed]+[brane page]+[blog]+[rnaseq-blog]+[omic tools]+[wikipedia]
Discovering meaningful gene interactions is crucial for the identification of novel regulatory processes in cells. Building the related graphs accurately remains challenging due to the large number of possible solutions from available data. Nonetheless, enforcing a priori knowledge of the graph structure, such as modularity, may reduce network indeterminacy issues. BRANE Clust (Biologically-Related A priori Network Enhancement with Clustering) refines gene regulatory network (GRN) inference thanks to cluster information. It works as a post-processing tool for inference methods (e.g. CLR, GENIE3). In BRANE Clust, the clustering is based on the inversion of systems of linear equations involving a graph-Laplacian matrix promoting a modular structure. Our approach is validated on DREAM4 and DREAM5 datasets with objective measures, showing significant comparative improvements. We provide additional insights on the discovery of novel regulatory or co-expressed links in the inferred Escherichia coli network evaluated using the STRING database. The comparative pertinence of clustering is discussed computationally (SIMoNe, WGCNA, X-means) and biologically (RegulonDB). BRANE Clust software is available at: http://www-syscom.univ-mlv.fr/~pirayre/Codes-GRN-BRANE-clust.html
Keywords: Optimization, Labeling, Biological information theory, Graphical models, Probabilistic logic, Merging, Databases, Gene regulatory network, clustering, combinatorial optimization, transcriptomic data, DREAM challenge
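The Laplacian-system idea can be illustrated on a toy gene graph: given seed labels on a few nodes, solving a linear system involving the graph Laplacian lets unlabeled nodes inherit scores from their neighbors. This is a generic sketch of Laplacian-based labeling, not the exact BRANE Clust functional:

```python
import numpy as np

# Toy 5-gene graph: a triangle {0,1,2} and a pair {3,4}.
W = np.array([[0, 1, 1, 0, 0],
              [1, 0, 1, 0, 0],
              [1, 1, 0, 0, 0],
              [0, 0, 0, 0, 1],
              [0, 0, 0, 1, 0]], dtype=float)
L = np.diag(W.sum(axis=1)) - W           # graph Laplacian

y = np.array([1.0, 0, 0, -1.0, 0])       # seeds: node 0 vs node 3
lam = 10.0                               # smoothness weight (illustrative)

# Solving (I + lam*L) f = y diffuses the seed labels along edges.
scores = np.linalg.solve(np.eye(5) + lam * L, y)
clusters = (scores > 0).astype(int)
assert clusters.tolist() == [1, 1, 1, 0, 0]  # two modules recovered
```

The Laplacian term penalizes score differences across edges, so tightly connected groups of genes end up sharing a cluster label, which is the modularity prior described in the abstract.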
10. CHOPtrey: contextual online polynomial extrapolation for enhanced multi-core co-simulation of complex systems, Abir Ben Khaled-El Feki, Laurent Duval, Cyril Faure, Daniel Simon, Mongi Ben Gaid, Simulation: Transactions of the Society for Modeling and Simulation International, Mar. 2017, Vol. 93, number 3, p. 185-200 [doi]+[bib]+[supplement]+[hal]+[preprint]+[arxiv]+[researchgate]+[xmod]+[blog]+[page]
The growing complexity of Cyber-Physical Systems (CPS), together with increasingly available parallelism provided by multi-core chips, fosters the parallelization of simulation. Simulation speed-ups are expected from co-simulation and parallelization based on model splitting into weakly-coupled sub-models, as for instance in the framework of the Functional Mockup Interface (FMI). However, slackened synchronization between sub-models and their associated solvers running in parallel introduces integration errors, which must be kept within acceptable bounds. CHOPtrey denotes a forecasting framework enhancing the performance of complex system co-simulation, articulated around three components. First, we consider the framework of a Computationally Hasty Online Prediction system (CHOPred). It improves the trade-off between integration speed-ups, needing large communication steps, and simulation precision, needing frequent updates for model inputs. Second, smoothed adaptive forward prediction improves co-simulation accuracy. It is obtained by past-weighted extrapolation based on Causal Hopping Oblivious Polynomials (CHOPoly). And third, signal behavior is segmented to handle the discontinuities of the exchanged signals: the segmentation is performed in a Contextual and Hierarchical Ontology of Patterns (CHOPatt). Implementation strategies and simulation results demonstrate the framework's ability to adaptively relax data communication constraints beyond synchronization points, which sensibly accelerates simulation. The CHOPtrey framework extends the range of applications of standard Lagrange-type methods, often deemed unstable. The embedding of predictions in lag-dependent smoothing and discontinuity handling demonstrates its practical efficiency.
Keywords: parallel simulation; Functional Mockup Interface (FMI); smoothed online prediction; causal polynomial extrapolation; context-based decision; internal combustion engine
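The causal polynomial extrapolation at the heart of CHOPoly can be sketched as a least-squares polynomial fit on past samples, extrapolated one step ahead. The function name is hypothetical and the sketch deliberately omits CHOPoly's past-weighting and context handling:

```python
import numpy as np

def causal_poly_extrapolate(past, degree=2, horizon=1):
    """Fit a polynomial to past samples (least squares) and
    extrapolate `horizon` steps beyond the last sample."""
    n = len(past)
    t = np.arange(n)
    coeffs = np.polyfit(t, past, degree)       # fit on the past window
    return np.polyval(coeffs, n - 1 + horizon) # evaluate in the future

# Sanity check: exact for signals that are themselves polynomials
# of the fitted degree (up to floating-point error).
t = np.arange(8)
quadratic = 0.5 * t**2 - t + 3.0
pred = causal_poly_extrapolate(quadratic, degree=2, horizon=1)
assert np.isclose(pred, 0.5 * 8**2 - 8 + 3.0)
```

In the co-simulation setting, such a predictor lets a sub-model keep integrating with an extrapolated input between two (sparse) communication points instead of holding the last received value.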
11. BARCHAN: Blob Alignment for Robust CHromatographic ANalysis (GCxGC), Camille Couprie, Laurent Duval, Maxime Moreaud, Sophie Hénon, Mélinda Tebib, Vincent Souchon, Journal of Chromatography A, February 2017 (Virtual Special Issue RIVA 2016) [doi]+[bib]+[preprint]+[hal]+[arxiv]+[pdf]+[blog]+[pubmed]+[info-fr]+[info-en]
Comprehensive two-dimensional gas chromatography (GCxGC, or GC2D) plays a central role in the elucidation of complex samples. The automation of the identification of peak areas is of prime interest to obtain a fast and repeatable analysis of chromatograms. To determine the concentration of compounds or pseudo-compounds, templates of blobs are defined and superimposed on a reference chromatogram. The templates then need to be modified when different chromatograms are recorded. In this study, we present a chromatogram and template alignment method based on peak registration called BARCHAN. Peaks are identified using a robust mathematical morphology tool. The alignment is performed by a probabilistic estimation of a rigid transformation along the first dimension, and a non-rigid transformation in the second dimension, taking into account noise, outliers and missing peaks in a fully automated way. Resulting aligned chromatograms and masks are presented on two datasets. The proposed algorithm proves to be fast and reliable. It significantly reduces the time to results for GCxGC analysis.
12. BRANE Cut: Biologically-Related Apriori Network Enhancement with Graph cuts for Gene Regulatory Network inference, Aurélie Pirayre, Camille Couprie, Laurent Duval, Frédérique Bidard, Jean-Christophe Pesquet, BMC Bioinformatics, December 2015 [code]+[doi]+[bib]+[preprint]+[biorxiv]+[hal]+[brane page]+[blog]+[rnaseq-blog]+[omic tools]+[wikipedia]+[youtube]
Background Inferring gene networks from high-throughput data (RNA-Seq) constitutes an important step in the discovery of relevant regulatory relationships in organism cells. Despite the large number of available Gene Regulatory Network inference methods, the problem remains challenging: the underdetermination in the space of possible solutions requires additional constraints that incorporate a priori information on gene interactions.
Results Weighting all possible pairwise gene relationships by a probability of edge presence, we formulate the regulatory network inference as a discrete variational problem on graphs. We enforce biologically plausible coupling between groups and types of genes by minimizing an edge labeling functional coding for a priori structures. The optimization is carried out with Graph cuts, an approach popular in image processing and computer vision. We compare the inferred regulatory networks to results achieved by the mutual-information-based Context Likelihood of Relatedness (CLR) method and to the state-of-the-art GENIE3, winner of the DREAM4 multifactorial challenge.
Conclusions Our BRANE Cut approach infers more accurately the five DREAM4 in silico networks (with improvements from 6% to 11%). On a real Escherichia coli compendium, an improvement of 11.8% compared to CLR and 3% compared to GENIE3 is obtained in terms of Area Under Precision-Recall curve. Up to 48 additional verified interactions are obtained over GENIE3 for a given precision. On this dataset involving 4345 genes, our method achieves a performance similar to that of GENIE3, while being more than seven times faster. The BRANE Cut code is available at: http://www-syscom.univ-mlv.fr/~pirayre/Codes-GRN-BRANE-cut.html
Keywords: Gene network inference, high throughput data, optimization, network theory, maximum flow
13. Euclid in a Taxicab: Sparse Blind Deconvolution with Smoothed l_1/l_2 Regularization (or SOOT, for Smoothed One-Over-Two norm ratio), Audrey Repetti, Mai Quyen-Pham, Laurent Duval, Émilie Chouzenoux, Jean-Christophe Pesquet, IEEE Signal Processing Letters, May 2015, Volume 22, Number 5, pages 539-543 [doi]+[bib]+[preprint]+[page+arxiv]+[hal]+[blog]+[nuit blanche]+[slides]+[matlab toolbox]+[http://lc.cx/soot]+[wikipedia]
Abstract: The l1/l2 ratio regularization function has shown good performance for retrieving sparse signals in a number of recent works, in the context of blind deconvolution. Indeed, it benefits from a scale invariance property highly desirable in the blind context. However, the l1/l2 function raises some difficulties when solving the nonconvex and nonsmooth minimization problems resulting from the use of such a penalty term in current restoration methods. In this paper, we propose a new penalty based on a smooth approximation to the l1/l2 function. In addition, we develop a proximal-based algorithm to solve variational problems involving this function and we derive theoretical convergence results. We demonstrate the effectiveness of our method through a comparison with a recent alternating optimization strategy dealing with the exact l1/l2 term, on an application to seismic data blind deconvolution (Ricker wavelet).
Keywords: Smoothed l_1/l_2 regularization, norm ratio, sparsity, blind deconvolution, nonconvex optimization, preconditioned forward-backward algorithm, seismic data processing
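A minimal numerical sketch of a smoothed l1/l2 penalty of this kind follows. The smoothing constants `alpha`, `beta`, `eta` are illustrative placeholders; the exact surrogate and its tuning are specified in the paper:

```python
import numpy as np

def smoothed_l1_over_l2(x, alpha=1e-7, beta=1e-2, eta=1e-1):
    """Smoothed l1/l2-style penalty: log of a smoothed l1 term
    over a smoothed l2 term, differentiable everywhere."""
    l1s = np.sum(np.sqrt(x**2 + alpha**2) - alpha)  # smooth |x_i|
    l2s = np.sqrt(np.sum(x**2) + eta**2)            # smooth ||x||_2
    return np.log((l1s + beta) / l2s)

# For equal l2 energy, the sparse vector is penalized less.
dense = np.ones(4)
sparse = np.array([2.0, 0.0, 0.0, 0.0])
assert smoothed_l1_over_l2(sparse) < smoothed_l1_over_l2(dense)
```

The smoothing terms remove the nonsmoothness of |x_i| at zero and the singularity of the ratio at x = 0, which is what makes proximal/forward-backward schemes applicable.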

14. Chromatogram baseline estimation and denoising using sparsity (BEADS) (on background estimation or baseline removal for analytical chemistry signals), Xiaoran Ning, Ivan Selesnick, Laurent Duval, Chemometrics and Intelligent Laboratory Systems, p. 156-167, Volume 139, December 2014 [doi]+[bib]+[preprint]+[page]+[matlabcentral]+[matlab code+ R code+ http://lc.cx/beads]+[blog]
This paper jointly addresses the problems of chromatogram baseline correction and noise reduction. The proposed approach is based on modeling the series of chromatogram peaks as sparse with sparse derivatives, and on modeling the baseline as a low-pass signal. A convex optimization problem is formulated so as to encapsulate these non-parametric models. To account for the positivity of chromatogram peaks, an asymmetric penalty function is used. A robust, computationally efficient, iterative algorithm is developed that is guaranteed to converge to the unique optimal solution. The approach, termed Baseline Estimation And Denoising with Sparsity (BEADS), is evaluated and compared with two state-of-the-art methods using both simulated and real chromatogram data.
Keywords: baseline correction, baseline wander, baseline drift, sparse derivative, asymmetric penalty, low-pass filtering, convex optimization
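The signal model (sparse positive peaks + low-pass baseline + noise) can be illustrated with a naive moving-average stand-in for the low-pass component. This is only a toy baseline estimator for intuition; BEADS itself solves a convex problem with asymmetric penalties rather than plain filtering:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500
t = np.arange(n)
baseline = 2.0 + np.sin(2 * np.pi * t / n)     # slow low-pass trend
peaks = np.zeros(n)
peaks[[100, 230, 360]] = [5.0, 3.0, 4.0]       # sparse, positive peaks
y = baseline + peaks + 0.05 * rng.standard_normal(n)

# Naive low-pass baseline estimate: centered moving average.
k = 51
pad = np.pad(y, k // 2, mode="edge")
base_est = np.convolve(pad, np.ones(k) / k, mode="valid")
corrected = y - base_est

# Away from peaks the estimate tracks the true baseline closely,
# and the corrected signal keeps the peaks standing out.
assert np.max(np.abs(base_est[40:70] - baseline[40:70])) < 0.1
assert corrected[100] > 4.0
```

The limitation visible here (the moving average is biased upward around peaks) is precisely what the asymmetric, positivity-aware penalties in BEADS are designed to avoid.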
15. A Primal-Dual Proximal Algorithm for Sparse Template-Based Adaptive Filtering: Application to Seismic Multiple Removal, Mai-Quyen Pham, Laurent Duval, Caroline Chaux, Jean-Christophe Pesquet, IEEE Transactions on Signal Processing, August 2014, Volume 62, Issue 16, pages 4256-4269 [page]+[arxiv]+[hal]+[doi]+[bib]+[blog]+[preprint]
Unveiling meaningful geophysical information from seismic data requires dealing with both random and structured "noises". As their amplitude may be greater than signals of interest (primaries), additional prior information is especially important in performing efficient signal separation. We address here the problem of multiple reflections, caused by wave-field bouncing between layers. Since only approximate models of these phenomena are available, we propose a flexible framework for time-varying adaptive filtering of seismic signals, using sparse representations, based on inaccurate templates. We recast the joint estimation of adaptive filters and primaries in a new convex variational formulation. This approach allows us to incorporate plausible knowledge about noise statistics, data sparsity and slow filter variation in parsimony-promoting wavelet frames. The designed primal-dual algorithm solves a constrained minimization problem that alleviates standard regularization issues in finding hyperparameters. The approach demonstrates very good performance in low signal-to-noise ratio conditions, both for simulated and real field seismic data.
Keywords: Convex optimization; Parallel algorithms; Wavelet transforms; Adaptive filters; Geophysical signal processing; Signal restoration; Sparsity; Signal separation.
16. Experimental hydrodynamic study of valve trays, R. Brahem and A. Royon-Lebeaud and D. Legendre and M. Moreaud and L. Duval, Chemical Engineering Science, Volume 100, 30 August 2013, Pages 23-32 [hal]+[doi]+[pdf]+[bib]
Valve tray column design for gas treatment still relies on empirical correlations developed on pilot units. The available correlations lead to large discrepancies, and thus more experimental work is needed. The present hydrodynamic study was carried out on a 1.26 m × 0.1905 m Plexiglas absorption column containing four V-4 Glitsch valve trays. A water/air system was used at atmospheric pressure and ambient temperature. The liquid rate per weir unit length was varied between 3.2×10^-3 m^3/(m·s) and 24.3×10^-3 m^3/(m·s), and the kinetic gas factor between 0 and 3.5 Pa^0.5. The following hydrodynamic parameters were determined: tray pressure drop, valve pressure drop, clear liquid height, mean emulsion height and mean liquid hold-up on the tray. Correlations for clear liquid height, mean liquid hold-up and emulsion height are proposed. Characterisation of emulsion profiles was made possible by post-processing of video records. Four different behaviours are identified for emulsion profiles according to liquid and gas velocities. Significant behaviour changes in the hydrodynamic parameters allowed the identification of three system limits: dumping, weeping and pre-flooding. Correlations are proposed for these limits and an operational diagram is presented.
Keywords: Absorption; Hydrodynamics; Multiphase flow; Scale-up; Valve trays; Emulsion height profiles; Process engineering
17. Adaptive multiple subtraction with wavelet-based complex unary Wiener filters, Sergi Ventosa, Sylvain Le Roy, Irène Huard, Antonio Pica, Hérald Rabeson, Patrice Ricarte, Laurent Duval, Geophysics, vol. 77, p. 183-192, November-December 2012 [doi]+[preprint]+[arxiv]+[page]+[ref]+[crossref]
Adaptive subtraction is a key element in predictive multiple-suppression methods. It minimizes misalignments and amplitude differences between modeled and actual multiples, and thus reduces multiple contamination in the dataset after subtraction. The challenge consists in attenuating multiples without distorting primaries, despite the high cross-correlation between their waveforms. For this purpose, this complicated wide-band problem is decomposed into a set of more tractable narrow-band problems using a 1D complex wavelet frame. This decomposition enables a single-pass adaptive subtraction via single-sample (unary) complex Wiener filters, consistently estimated on overlapping windows in a complex wavelet transformed domain. Each unary filter compensates amplitude differences within its frequency support, and robustly rectifies small and large misalignment errors through phase and integer delay corrections. This approach greatly simplifies the matching filter estimation and, despite its simplicity, compares promisingly with standard adaptive 2D methods, on both synthetic and field data.
Keywords: multiples, processing, filtering, cross-correlation, wavelet, adaptive, model
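A unary (single-tap) complex Wiener filter reduces, within each subband window, to one closed-form complex scalar that matches both amplitude and phase before subtraction. A minimal numpy sketch of this matching-and-subtracting step (illustrative only, outside any actual wavelet frame; names are assumptions):

```python
import numpy as np

# Unary complex Wiener filter: a single complex coefficient w per window,
# chosen to minimize ||data - w * model||^2, then subtracted.
def unary_wiener_subtract(model, data, eps=1e-12):
    w = np.vdot(model, data) / (np.vdot(model, model).real + eps)
    return data - w * model

rng = np.random.default_rng(0)
model = rng.standard_normal(256) + 1j * rng.standard_normal(256)
# A "multiple": the modeled signal, amplitude-scaled and phase-rotated
data = 0.7 * np.exp(1j * 0.3) * model
residual = unary_wiener_subtract(model, data)
# The single tap absorbs both scale and phase, so the residual is negligible
```

One complex scalar per window thus replaces a full 2D matching filter, which is the simplification the abstract refers to.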
18. A Panorama on Multiscale Geometric Representations, Intertwining Spatial, Directional and Frequency Selectivity (on 2D wavelets, contourlets, curvelets, bandelets, steerlets, dual-tree, etc.), Laurent Jacques, Laurent Duval, Caroline Chaux and Gabriel Peyré, Signal Processing, Volume 91, Issue 12, Pages 2699-2730, Dec. 2011, Special issue on Advances in Multirate Filter Bank Structures and Multiscale Representations [doi]+[arXiv]+[hal]+[eprintweb]+[pdf]+[material]
The richness of natural images makes the quest for optimal representations in image processing and computer vision challenging. The latter observation has not prevented the design of image representations, which trade off between efficiency and complexity, while achieving accurate rendering of smooth regions as well as reproducing faithful contours and textures. The most recent ones, proposed in the past decade, share a hybrid heritage highlighting the multiscale and oriented nature of edges and patterns in images. This paper presents a panorama of the aforementioned literature on decompositions in multiscale, multi-orientation bases or dictionaries. They typically exhibit redundancy to improve sparsity in the transformed domain and sometimes its invariance with respect to simple geometric deformations (translation, rotation). Oriented multiscale dictionaries extend traditional wavelet processing and may offer rotation invariance. Highly redundant dictionaries require specific algorithms to simplify the search for an efficient (sparse) representation. We also discuss the extension of multiscale geometric decompositions to non-Euclidean domains such as the sphere or arbitrary meshed surfaces. The etymology of panorama suggests an overview, based on a choice of partially overlapping "pictures". We hope that this paper will contribute to the appreciation and apprehension of a stream of current research directions in image understanding.
Keywords: Review; Multiscale; Geometric representations; Oriented decompositions; Scale-space; Wavelets; Atoms; Sparsity; Redundancy; Bases; Frames; Edges; Textures; Image processing; Haar wavelet; Non-Euclidean wavelets
19. Pre-ignition in Highly Charged Spark Ignition Engines - Visualisation and Analysis, Jean-Marc Zaccardi, Matthieu Lecompte, Laurent Duval, Alexandre Pagot, Motortechnische Zeitschrift (MTZ), Dec. 2009 [doi]+[preprint]+[volume]+[link]
Highly boosted spark ignition engines must confront violent forms of pre-ignition limiting the maximal low-end torque. French research institute IFP presents here an innovative tool allowing a better understanding of this phenomenon and a structured reasoning considering all its potential causes. Advanced statistical analyses of the combustion process and direct visualisations inside the combustion chamber are successfully combined to accurately assess the development of pre-ignition. This coupled approach provides an efficient tool for the analysis and development of new engines and new control concepts on IFP test beds.
Keywords: Combustion Analysis; Pre-ignition; Detection; Rumble; Knock
20. Optimization of Synthesis Oversampled Complex Filter Banks, Jérôme Gauthier, Laurent Duval and Jean-Christophe Pesquet, IEEE Transactions on Signal Processing, October 2009, Volume 57, Issue 10, p. 3827-3843 [doi]+[hal]+[arxiv]+[code]
An important issue with oversampled FIR analysis filter banks (FBs) is to determine inverse synthesis FBs, when they exist. Given any complex oversampled FIR analysis FB, we first provide an algorithm to determine whether there exists an inverse FIR synthesis system. We also provide a method to ensure the Hermitian symmetry property on the synthesis side, which is useful when processing real-valued signals. As an invertible analysis scheme corresponds to a redundant decomposition, there is no unique inverse FB. Given a particular solution, we parameterize the whole family of inverses through a null space projection. The resulting reduced parameter set simplifies design procedures, since the perfect reconstruction constrained optimization problem is recast as an unconstrained optimization problem. The design of optimized synthesis FBs based on time or frequency localization criteria is then investigated, using a simple yet efficient gradient algorithm.
Keywords: Filter design; frequency localization; inversion; lapped transforms; modulated filter banks; optimization; oversampled filter banks; time localization; inverse overlapped complex filter banks optimization; SURE denoising
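The null-space parameterization of all inverses can be illustrated on plain matrices: every left inverse of a tall analysis operator A is the pseudo-inverse plus an arbitrary component supported on the orthogonal complement of range(A). A minimal numpy sketch (finite matrices stand in for the paper's polyphase formulation):

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((8, 5))   # tall analysis operator (oversampled FB analogue)
A_pinv = np.linalg.pinv(A)        # one particular synthesis: the minimum-norm inverse

# Every left inverse writes S = A_pinv + Z @ P, with P the projector onto
# the orthogonal complement of range(A) and Z a free parameter matrix.
P = np.eye(8) - A @ A_pinv
Z = rng.standard_normal((5, 8))
S = A_pinv + Z @ P                # still satisfies perfect reconstruction: S @ A = I
```

Optimizing over the free parameter Z is then an unconstrained problem, which is the recasting the abstract describes.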
21. A Nonlinear Stein Based Estimator for Multichannel Image Denoising, Caroline Chaux, Laurent Duval, Amel Benazza-Benyahia and Jean-Christophe Pesquet, IEEE Transactions on Signal Processing, August 2008, Volume 56, Issue 8, Part 2, p. 3855-3870 [doi]+[abstract]+[preprint]+[hal]+[arxiv]+[matlab toolbox]+[code]
The use of multicomponent images has become widespread with the improvement of multisensor systems having increased spatial and spectral resolutions. However, the observed images are often corrupted by additive Gaussian noise. In this paper, we are interested in multichannel image denoising based on a multiscale representation of the images. A multivariate statistical approach is adopted to take into account both the spatial and the inter-component correlations existing between the different wavelet subbands. More precisely, we propose a new parametric nonlinear estimator which generalizes many reported denoising methods. The derivation of the optimal parameters is achieved by applying Stein's principle in the multivariate case. Experiments performed on multispectral remote sensing images clearly indicate that our method outperforms conventional wavelet denoising techniques.
Keywords: M-band wavelet transform; Block estimator; Stein's principle; denoising; dual-tree wavelet transform; frames; multichannel noise; multicomponent image; multivariate estimation; nonlinear estimation ; complex wavelet thresholding
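The paper derives a multivariate nonlinear Stein-based estimator; the underlying principle can be illustrated in its simplest scalar form, the classical SURE-driven choice of a soft threshold (a textbook analogue, not the paper's estimator):

```python
import numpy as np

def sure_soft(y, t, sigma):
    # Stein's unbiased estimate of the risk of soft thresholding at level t
    return (y.size * sigma**2
            - 2 * sigma**2 * np.count_nonzero(np.abs(y) <= t)
            + np.sum(np.minimum(np.abs(y), t) ** 2))

rng = np.random.default_rng(2)
x = np.where(rng.random(1000) < 0.05, 5.0, 0.0)  # sparse clean coefficients
y = x + rng.standard_normal(1000)                # noisy observation, sigma = 1
ts = np.linspace(0.0, 4.0, 81)
t_best = ts[np.argmin([sure_soft(y, t, 1.0) for t in ts])]
x_hat = np.sign(y) * np.maximum(np.abs(y) - t_best, 0.0)  # denoised estimate
```

The key point is that the risk is estimated from the noisy data alone, without access to the clean signal, which is what makes Stein's principle usable for parameter selection.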
22. Noise covariance properties in Dual-Tree Wavelet Decompositions, Caroline Chaux, Jean-Christophe Pesquet, Laurent Duval, IEEE Transactions on Information Theory, December 2007, Volume 53, Issue 12, p. 4680-4700 [doi]+[arxiv]+[preprint]+[code]
Dual-tree wavelet decompositions have recently gained much popularity, mainly due to their ability to provide an accurate directional analysis of images combined with a reduced redundancy. When the decomposition of a random process is performed -- which occurs in particular when an additive noise is corrupting the signal to be analyzed -- it is useful to characterize the statistical properties of the dual-tree wavelet coefficients of this process. As dual-tree decompositions constitute overcomplete frame expansions, correlation structures are introduced among the coefficients, even when a white noise is analyzed. In this paper, we show that it is possible to provide an accurate description of the covariance properties of the dual-tree coefficients of a wide-sense stationary process. The expressions of the (cross-)covariance sequences of the coefficients are derived in the one- and two-dimensional cases. Asymptotic results are also provided, making it possible to predict the behaviour of the second-order moments for large lag values or at coarse resolution. In addition, the cross-correlations between the primal and dual wavelets, which play a primary role in our theoretical analysis, are calculated for a number of classical wavelet families. Simulation results are finally provided to validate the theoretical results.
Keywords: Covariance; Hilbert transform cross-correlation; dependence; dual-tree; filter banks; frames; noise; random processes; stationarity; statistics; wavelets
23. Comprehensive Two-Dimensional Gas Chromatography for Detailed Characterisation of Petroleum Products, Colombe Vendeuvre, R. Ruiz-Guerrero, Fabrice Bertoncini, Laurent Duval, Didier Thiébaut, Oil and Gas Science and Technology - Revue de l'IFP, Special issue on "Recent Advances in the Analysis of Catalysts and Petroleum Products", 2007, Vol. 62, no. 1, p. 43-55 [doi]+[preprint]
Comprehensive two-dimensional gas chromatography (GCxGC or GC2D) is a major advance for the detailed characterisation of petroleum products. This technique is based on two orthogonal dimensions of separation achieved by two chromatographic capillary columns of different chemistries and selectivities. High-frequency sampling between the two columns is achieved by a modulator, ensuring that the whole sample is transferred and analysed continuously in both separations. Thus, the peak capacity and the resolving power dramatically increase. Besides, highly structured 2D chromatograms, organised according to the volatility and the polarity of the solutes, are obtained, providing more accurate molecular identification of hydrocarbons. In this paper, fundamental and practical considerations for the implementation of GCxGC are reviewed. Selected applications obtained using a prototype GCxGC chromatograph developed in-house highlight the potential of the technique for molecular characterisation of middle distillates, sulphur speciation in diesel and analysis of effluents from petrochemical processes.
24. Image Analysis Using a Dual-Tree M-Band Wavelet Transform, Caroline Chaux, Laurent Duval, Jean-Christophe Pesquet, IEEE Transactions on Image Processing, August 2006, Volume 15, Issue 8, p. 2397-2412 [doi]+[arxiv]+[hal]+[code]+[ieee]+[citeseer]+[pubmed]
We propose a two-dimensional generalization to the M-band case of the dual-tree decomposition structure (initially proposed by Kingsbury and further investigated by Selesnick) based on a Hilbert pair of wavelets. We particularly address: 1) the construction of the dual basis and 2) the resulting directional analysis. We also revisit the necessary pre-processing stage in the M-band case. While several reconstructions are possible because of the redundancy of the representation, we propose a new optimal signal reconstruction technique, which minimizes potential estimation errors. The effectiveness of the proposed M-band decomposition is demonstrated via denoising comparisons on several image types (natural, texture, seismics), with various M-band wavelets and thresholding strategies. Significant improvements in terms of both overall noise reduction and direction preservation are observed.
Keywords: Direction selection; Hilbert transform; dual-tree; image denoising; wavelets
25. Characterization of middle-distillates by comprehensive two-dimensional gas chromatography (GCxGC): a powerful alternative for performing various standard analyses of middle distillates, Colombe Vendeuvre, Rosario Ruiz-Guerrero, Fabrice Bertoncini, Laurent Duval, Didier Thiébaut, Marie-Claire Hennion, Journal of Chromatography A, 1086 (2005) p. 21-28 [doi]+[pubmed]
The detailed characterisation of middle distillates is essential for a better understanding of the reactions involved in refining processes. Owing to its higher resolution power and enhanced sensitivity, comprehensive two-dimensional gas chromatography (GCxGC) is a powerful tool for improving the characterisation of petroleum samples. The aim of this paper is to compare GCxGC and various ASTM methods - gas chromatography (GC), liquid chromatography (LC) and mass spectrometry (MS) - for group-type separation and detailed hydrocarbon analysis. The best features of GCxGC are demonstrated and compared to these techniques in terms of cost, time consumption and accuracy. In particular, a new approach to simulated distillation (SimDis-GCxGC) is proposed: compared to the standard method ASTM D2887, it gives unequalled information for a better understanding of conversion processes.
Keywords: Comprehensive two-dimensional gas chromatography; ASTM methods; Simulated distillation; Group type separation; Hydrocarbons
26. Lapped transforms and hidden Markov models for seismic data filtering, Laurent Duval, Caroline Chaux, International Journal of Wavelets, Multiresolution and Information Processing (IJWMIP), Vol. 2, No. 4, Dec. 2004, p. 455-476, [doi]+[preprint]+[hal]
Seismic exploration provides information about the ground substructures. Seismic images are generally corrupted by several noise sources. Hence, efficient denoising procedures are required to improve the detection of essential geological information. Wavelet bases provide sparse representations for a wide class of signals and images. This property makes them good candidates for efficient filtering tools, allowing the separation of signal and noise coefficients. Recent works have improved their performance by modelling the intra- and inter-scale coefficient dependencies using hidden Markov models, since image features tend to cluster and persist in the wavelet domain. This work focuses on the use of lapped transforms associated with hidden Markov modelling. Lapped transforms are traditionally viewed as block transforms, composed of M pass-band filters. Seismic data present oscillatory patterns, and the oscillatory bases of lapped transforms have demonstrated good performance for seismic data compression. A dyadic-like representation of lapped transform coefficients is possible, allowing a wavelet-like modelling of coefficient dependencies. We show that the proposed filtering algorithm often outperforms wavelet-based filtering both objectively (in terms of SNR) and subjectively: lapped transforms better preserve the oscillatory features present in seismic data at low to moderate noise levels.
Keywords: seismic data filtering; lapped transforms; hidden Markov models
27. Comparison of conventional gas chromatography and comprehensive two-dimensional gas chromatography for the detailed analysis of petrochemical samples, Colombe Vendeuvre, Fabrice Bertoncini, Laurent Duval, Jean-Luc Duplan, Didier Thiébaut, Marie-Claire Hennion, Journal of Chromatography A, 1056, 2004, p. 155-162 [doi]+[preprint]+[pubmed]
Comprehensive two-dimensional gas chromatography (GCxGC) has been investigated for the characterization of highly valuable petrochemical samples from dehydrogenation of n-paraffins, Fischer-Tropsch and oligomerization processes. GCxGC separations, performed using a dual-jet CO2 modulator, were optimized using a test mixture representative of the hydrocarbons found in petrochemicals. For complex samples, a comparison of GCxGC qualitative and quantitative results with conventional gas chromatography (1D-GC) has demonstrated an improved resolution power of major importance for the processes: the group-type separation has permitted the detection of aromatic compounds in the products from dehydrogenation of n-paraffins and from oligomerization, and the separation of alcohols from other hydrocarbons in Fischer-Tropsch products.
Keywords: Gas chromatography, comprehensive two-dimensional; Petrochemical samples; Hydrocarbons

## Book chapters/Participations à ouvrages

1. Data processing applied to GCxGC (quantification, sample comparison, post-processing). Applications to the petroleum and food industries, Benoit Celse, F. Bertoncini, M. Moreaud, L. Duval, M. Courtiade, C. Dartiguelongue, S. Esnault, D. Cavagnino, in Gas chromatography and 2D-gas chromatography for petroleum industry: The race for selectivity (Editors: Fabrice Bertoncini, Marion Courtiade-Tholance, Didier Thiébaut), 2013 [book+blog]
2. Guide to competitive intelligence for research, with a short presentation (March 2012, published by the D2IE; cf. Guide d'Intelligence économique pour la recherche and Intelligence économique (et scientifique) pour la recherche).
3. Wavelet transform for the denoising of multivariate images, Caroline Chaux, Amel Benazza-Benyahia, Jean-Christophe Pesquet, Laurent Duval, in Multivariate Image Processing, Jocelyn Chanussot, Christophe Collet, Kacem Chehdi, editors, ISTE Ltd and John Wiley and Sons Inc, January 2010 [book+hal]
Increasing attention is being paid to multispectral images in a great number of applications (medicine, agriculture, archeology, forestry, coastal management, remote sensing), because many features of the underlying scene have unique spectral characteristics that become apparent in imagery when viewing combinations of its different components. [...] Despite dramatic technological advances in the spatial and spectral resolution of radiometers, data still suffer from several degradations. For instance, the sensor's limited aperture, aberrations inherent to optical systems and mechanical vibrations create a blur effect in remote-sensing images [JAI 89]. In optical remote sensing imagery, there are also many noise sources. Firstly, the number of photons received by each sensor during the obturation time may fluctuate around its average, implying photon noise. Thermal noise may be caused by the electronics of the recording and communication channels during data downlinking. Intermittent saturation of any detector in a radiometer may give rise to impulsive noise, whereas structured periodic noise is generally caused by interference between electronic components. Detector striping (resp. banding) is a consequence of calibration differences among individual scanning detectors (resp. from scan to scan). Besides, component-to-component misregistration may occur: corresponding pixels in different components are not systematically associated with the same position on the ground. As a result, it is mandatory to apply deblurring, denoising and geometric corrections to the degraded observations in order to fully exploit the information they contain. In this respect, it is usual to distinguish between on-board and on-ground processing. Indeed, on-board procedures must simultaneously fulfil real-time constraints and low mass-memory requirements.
The acquisition bit-rates involved are high (especially for very high resolution missions), and hence complicate the software implementation of enhancement processing. [...]

## Communications à congrès/Conference papers

1. Refinable-precision in mesh compression for upscaling and upgridding in reservoir simulation with HexaShrink, Lauriane Bouard, Laurent Duval, Christophe Preux, Frédéric Payan, Marc Antonini, RING Meeting, September 2019, Nancy, France [site]
Efficiency is also the main concern of Lauriane Bouard (PhD student, IFPEN and Univ. Côte d’Azur) et al., who propose a multi-resolution scheme to compress corner-point hexahedral grids for flow applications.
2. Forme lissée de rapports de normes lp/lq (SPOQ) pour la reconstruction des signaux avec pénalisation parcimonieuse (Smoothed lp/lq norm ratio (SPOQ) for sparsity-penalized signal reconstruction), Afef Cherni, Emilie Chouzenoux, Laurent Duval, and Jean-Christophe Pesquet, colloque international francophone de Traitement du Signal et de l'Image GRETSI, 26-29 August 2019, Lille, France [opus+hal]
3. A Novel Smoothed Norm Ratio for Sparse Signal Restoration; Application to Mass Spectrometry, Afef Cherni, Emilie Chouzenoux, Laurent Duval, and Jean-Christophe Pesquet, Signal Processing with Adaptive Sparse Structured Representations (SPARS) workshop, 1-4 July 2019, Toulouse, France [opus+hal]
4. Lossless progressive compression of meshes for upscaling and upgridding in reservoir simulation with HexaShrink, European Association of Geoscientists and Engineers, EAGE 2019, Dynamic Reservoir Characterization and Modelling II, June 2019, London, UK [opus]+[pdf]+[slides]+[picture]+[doi]+[link]+[session]+[tweet]
Manipulation of large data volumes is becoming a concern in the field of simulation. In reservoir simulation, grids are complex structures composed of heterogeneous objects, with a potentially large number of cells. HexaShrink is a perfectly reversible multiscale decomposition tool based on 3D wavelet transforms, dedicated to structured hexahedral meshes. It provides a representation of mesh geometry and properties at different resolutions, coherently preserving discontinuities such as fault networks. This lossless compression method was initially designed for storage and transfer. Here we evaluate its suitability in the context of two-phase flow simulations. Initial results demonstrate comparable impacts of low-resolution grids on injected water. This suggests that embedded low-resolution meshes can serve accelerated approximate simulation with little overhead, in a consistent upscaling and upgridding fashion.
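HexaShrink combines classical and morphological wavelets on hexahedral meshes; its key property of perfect reversibility can be illustrated in one dimension with an integer Haar lifting scheme (a minimal sketch, not the actual HexaShrink decomposition):

```python
import numpy as np

def haar_lift(x):
    # Forward integer Haar transform via lifting: exactly invertible
    even, odd = x[0::2], x[1::2]
    d = odd - even              # detail coefficients
    a = even + (d >> 1)         # approximation (rounded pairwise mean)
    return a, d

def haar_unlift(a, d):
    # Reverse the lifting steps in opposite order: lossless by construction
    even = a - (d >> 1)
    odd = d + even
    x = np.empty(a.size + d.size, dtype=a.dtype)
    x[0::2], x[1::2] = even, odd
    return x

rng = np.random.default_rng(3)
prop = rng.integers(0, 1000, size=64)  # e.g. one column of an integer cell property
a, d = haar_lift(prop)                 # half-resolution version + details
```

Because each lifting step is undone exactly by its inverse, the low-resolution approximation plus details reconstruct the original bit-for-bit, which is what "perfectly reversible multiscale" means here.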
5. Robustifying a PLS property prediction workflow on NMR spectra with optimized processing, Laurent Duval, Marion Lacoue-Nègre, Aurélie Pirayre, Faraday discussions, Challenges in analysis of complex natural mixtures, May 2019 [hal]+[abstract]+[poster]
We revisit a validated PLS model for property prediction on NMR spectra. We investigate the benefits (precision, robustness, sensitivity) of integrating advanced optimization algorithms alongside the usual least squares in baseline correction, smoothing and prediction.
6. Décomposition multi-échelles de maillages 3D hexaédriques dans le domaine des géosciences. Étude des performances en compression sans pertes (Multiscale decomposition of hexahedral 3D meshes from geosciences. Study of lossless compression performance), Lauriane Bouard, Laurent Duval, Frédéric Payan, Marc Antonini, colloque compression et représentation des signaux audiovisuels, CORESA 2018, 12-14 November 2018, Poitiers, France [hal]+[preprint]+[slides]+[opus]
Mesh-based simulation methods are used in many scientific fields to characterize underlying physical phenomena. Growing accuracy requirements lead to ever larger meshes, raising issues for data visualization, manipulation and storage. In geosciences, huge geological zones are modeled by geometrically complex hexahedral meshes emulating physical properties of the subsurface. By combining different types of classical or morphological wavelets that preserve discontinuities, HexaShrink (HS) brings a coherent multiscale solution to this problem. In this paper, we analyze the compression performance of HS through an exhaustive study of seven heterogeneous meshes. Overall, the increased compression ratios obtained thanks to the decomposition are satisfactory. Nevertheless, a separate analysis of the different components defining the meshes (geometry, properties) reveals that the use of generic encoders is not always optimal, opening perspectives toward multiscale encoders better able to exploit the intrinsic structures of the levels of detail.
7. Arthur Marmin, Marc Castella, Jean-Christophe Pesquet, Laurent Duval, Signal Reconstruction from Sub-sampled and Nonlinearly Distorted Observations, EUSIPCO 2018, Rome, Italy, September 3-7, 2018
8. Jean Charléty, Laurent Duval, Yash Bhalgat, CatsEyes : classification de structures géologiques par représentation parcimonieuse en réseaux de trames d'ondelettes (CatsEyes: classifying geological structures through sparse representations from scattering wavelet networks (scatnet)), 41st ISS France Day: Mathematical morphology and stereology (41ème journée ISS France, morphologie mathématique et stéréologie), February 1st, Paris, France
9. CATSEYES: Categorizing Seismic structures with tessellated scattering wavelet networks, with Yash Bhalgat (University of Michigan) and Jean Charléty (IFPEN), ICASSP 2018
As field seismic data sizes increase dramatically toward exabytes, automating the labeling of "structural monads" - corresponding to geological patterns and yielding subsurface interpretation - in a huge amount of available information would drastically reduce interpretation time. Since customary designed features may not account for the gradual deformation observable in seismic data, we propose to adapt the wavelet-based scattering network methodology with a tessellation of geophysical images. Its invariances are expected to thwart the effect of tectonics. The sparse structure of extracted feature vectors suggests resorting to dimension reduction methods before classification. The most promising one is based on a tessellated version of scattering decompositions, combined with a standard affine PCA classifier. Extensive comparative results on a four-class seismic database show the effectiveness of the proposed method in terms of seismic data labeling and object retrieval, in affordable computational time.
10. Décomposition progressive de maillages hexaédriques avec discontinuités : application aux modèles géologiques (Progressive decomposition of hexahedral meshes with discontinuities: application to geological models), GRETSI 2017
11. HOGMep: variational Bayes and higher-order graphical models applied to joint image recovery and segmentation, ICIP 2017, Sep. 17-20, 2017, Beijing, China [page]+[hal]
Variational Bayesian approaches have been successfully applied to image segmentation. They usually rely on a Potts model for the hidden label variables and a Gaussian assumption on pixel intensities within a given class. Such models may however be limited, especially in the case of multicomponent images. We overcome this limitation with HOGMep, a Bayesian formulation based on a higher-order graphical model (HOGM) on labels and a Multivariate Exponential Power (MEP) prior for intensities within a class. We then develop an efficient statistical estimation method to solve the associated problem. Its flexibility accommodates a broad range of applications, demonstrated on multicomponent image segmentation and restoration.
13. Efficient extrapolation for parallel co-simulation of coupled systems (CHOPtrey) (slides), IUTAM Symposium "Solver Coupling and Co-Simulation", September 18-20, 2017, Darmstadt, Germany
Building high-fidelity system-level models of Cyber-Physical Systems (CPS) is a challenging duty. A first problem is the diversity of modeling and simulation environments used by the various multi-disciplinary teams involved. Particular environments are preferred for a specific use, due to distinctive strengths (modeling language, libraries, solvers, cost, etc.). The Functional Mock-up Interface (FMI) specification has been proposed to mitigate this issue. A second problem is the growing complexity of such high-fidelity models and their prohibitive CPU execution time. Indeed, major system-level simulation software packages rely on sequential ODE/DAE solvers, and are currently unable to efficiently exploit the parallelism provided by multi-core chips. In addition, CPS are commonly modeled as hybrid models whose major challenge resides in their numerous discontinuities; discontinuities usually prevent high integration speeds with variable-step solvers. We propose a modular co-simulation of a split model, where each sub-model is integrated with its own solver. Thanks to splitting, high integration speeds can be reached when using LSODAR (Livermore Solver for Ordinary Differential equations, with Automatic method switching for stiff and nonstiff problems, and with Root-finding), a variable-step solver with root-finding capability. Interruptions coming from unrelated events are avoided thanks to event containment; moreover, event detection and location inside a sub-model can be processed faster because they involve a smaller set of variables. Nevertheless, partitioning a complex model into several less complex sub-models also brings difficulties that need to be managed. First, partitioning may add virtual algebraic loops, therefore involving delayed outputs, even with an efficient execution order. To avoid the latter, we have proposed a new co-simulation method based on a refined scheduling approach.
This technique, denoted "RCosim", retains the speed-up advantage of modular co-simulation thanks to the parallel execution of the sub-models. Furthermore, it improves the accuracy of simulation results through an offline scheduling of operations that takes care of model input/output dynamics. Second, partitioning, and co-simulation in general, require synchronization between coupled models to exchange updated data and reduce numerical error propagation in simulation results. Thus, tight synchronization, using small communication steps, is required between blocks; this greatly limits the possibilities to accelerate the simulation. Adaptive communication steps may better handle changes in model dynamics; meanwhile, the stability of multi-rate simulators needs to be carefully assessed. Data extrapolation over steps is expected to enhance precision over large communication steps. However, complex models usually present non-linearities and discontinuities, entangling forecasts from past observations only. We propose a Computationally Hasty Online Prediction framework (CHOPred) to stretch out synchronization steps with negligible precision changes in the simulation, at low complexity. It improves the trade-off between speed-ups, needing large communication steps, and precision, needing frequent updates of model inputs. It is based on a Contextual & Hierarchical Ontology of Patterns (CHOPatt) that handles the discontinuities of exchanged signals by selecting appropriate Causal Hopping Oblivious Polynomials (CHOPoly). CHOPtrey uses a time-dependent oblivion faculty for polynomial extrapolation, with an independent power weighting on past samples. It accounts for the memory-depth changes required to adapt to sudden variations. Computations are performed on frames of samples hopping at synchronization steps, allowing a low-complexity matrix formulation.
It is implemented in the xMOD tool in combination with model splitting and parallel simulation, and applied to a hybrid dynamical engine model whose split parts are exported as FMUs from Dymola. Test results show effective simulation speed-ups with imperceptible computational overheads. In addition, sustained or even improved simulation precision is obtained without noticeable instability.
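Read as above, a Causal Hopping Oblivious Polynomial amounts to a causal weighted polynomial fit on past samples, with a power weighting that progressively "forgets" older data, evaluated beyond the last sample. A minimal sketch under that reading (function name, degree and forgetting value are illustrative assumptions, not CHOPtrey's actual parameters):

```python
import numpy as np

def extrapolate_poly(past, horizon, degree=2, forget=0.9):
    # Causal weighted polynomial fit on past samples; older samples are
    # progressively "forgotten" through a power weighting
    n = past.size
    t = np.arange(n, dtype=float)
    w = forget ** (n - 1 - t)          # weight 1 on the newest sample
    # np.polyfit weights multiply residuals linearly, so sqrt gives
    # the intended weights on the squared errors
    coeffs = np.polyfit(t, past, degree, w=np.sqrt(w))
    return np.polyval(coeffs, n - 1 + horizon)

t = np.arange(20, dtype=float)
signal = 0.5 * t**2 - t + 3.0          # exactly quadratic trajectory
pred = extrapolate_poly(signal, horizon=2.0)
exact = 0.5 * 21.0**2 - 21.0 + 3.0     # true value two steps ahead
```

Extrapolating model outputs this way lets coupled sub-models run with larger communication steps while limiting the error introduced by delayed exchanges.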
14. HexaShrink: Multiresolution compression of large structured hexahedral meshes with discontinuities in geosciences (ICIP 2016)
15. Galerkin method using optimized wavelet-Gaussian mixed bases for electronic structure calculations in quantum chemistry, Laurent Duval, Luigi Genovese, Valérie Perrier, Quang Huy Tran, Dinh Huong Pham, 19th European Conference on Mathematics for Industry (ECMI 2016), Optimization Session, 13-17 June 2016, Santiago de Compostela, Spain
16. Efficient GCxGC data treatment with BARCHAN algorithm (Blob Alignment for Robust Chromatographic Analysis) and anchor points (13th GCxGC SYMPOSIUM 2016)
17. Sparse adaptive template matching and filtering for 2D seismic images with dual-tree wavelets and proximal methods, Mai-Quyen Pham, Caroline Chaux, Laurent Duval, Jean-Christophe Pesquet, IEEE International Conference on Image Processing, ICIP 2015 [poster]
18. Suppression de ligne de base et débruitage de chromatogrammes par pénalisation asymétrique de positivité et dérivées parcimonieuses (Chromatogram baseline suppression and denoising via asymmetric positivity penalization and sparse derivatives), September 2015, GRETSI, colloque international francophone de Traitement du Signal et de l'Image, Lyon, France, with the BEADS Matlab toolbox http://lc.cx/beads and the nice poster
19. Reconstruction de signaux parcimonieux à l'aide d'un algorithme rapide d'échantillonnage stochastique, September 2015, GRETSI, colloque international francophone de Traitement du Signal et de l'Image, Lyon, France
20. Deconvolution of seismic data with a regularized norm ratio, Audrey Repetti, Mai Quyen-Pham, Laurent Duval, Émilie Chouzenoux, Jean-Christophe Pesquet, ICIAM 2015, Eighth International Congress on Industrial and Applied Mathematics, "Sparsity-promoting seismic data analysis" minisymposium (program), with sparse deconvolution Matlab toolbox http://lc.cx/soot [abstract|slides]
21. Graph enhancement via clustering: application to Gene Regulatory Network inference, Aurélie Pirayre, Camille Couprie, Laurent Duval, Jean-Christophe Pesquet, One-day Workshop on Emerging Trends in Clustering, 2015, Orléans, France
22. BRANE Cut: Biologically-Related A priori Network Enhancement with Graph cuts for Gene Regulatory Network Inference, Aurélie Pirayre, Camille Couprie, Laurent Duval, Jean-Christophe Pesquet, ABS4NGS 2015 Workshop, Paris, France
23. Graph inference enhancement with clustering: application to gene regulatory network reconstruction, Aurélie Pirayre, Camille Couprie, Laurent Duval, Jean-Christophe Pesquet, EUSIPCO 2015, Nice, France
24. Bases mixtes ondelettes-gaussiennes pour le calcul de structures électroniques en chimie quantique, Dinh Huong Pham, Valérie Perrier, Quang Huy Tran, Luigi Genovese, Laurent Duval, SMAI 2015, 7e Biennale Française des Mathématiques Appliquées et Industrielles, 9-12 June 2015
25. Fast convex optimization for connectivity enforcement in gene regulatory network inference, Aurélie Pirayre, Camille Couprie, Laurent Duval, Jean-Christophe Pesquet, IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP 2015)
26. Euclid in a Taxicab: Sparse Blind Deconvolution with Smoothed l1/l2 Regularization, Audrey Repetti, Mai Quyen-Pham, Laurent Duval, Émilie Chouzenoux, Jean-Christophe Pesquet, IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP 2015) [DOI+pdf+page+arXiv+hal+blog+nuit blanche+poster+SOOT Matlab toolbox+http://lc.cx/soot]
27. Discrete vs Continuous Optimization for Gene Regulatory Network Inference, Aurélie Pirayre, Camille Couprie, Frédérique Bidard-Michelot, Laurent Duval, Jean-Christophe Pesquet, International Biomedical and Astronomical Signal Processing Workshop (BASP 2015)
28. Image Processing for Materials Characterization: Issues, Challenges and Opportunities, Laurent Duval, Maxime Moreaud, Camille Couprie, Dominique Jeulin, Hugues Talbot, Jesús Angulo, Special session on "Image Processing for Materials Characterization" (blog information), International Conference on Image Processing (ICIP), Oct.-Nov. 2014, Paris
This introductory paper aims at summarizing some problems and state-of-the-art techniques encountered in image processing for material analysis and design. Developing generic methods for this purpose is a complex task given the variability of the different image acquisition modalities (optical, scanning or transmission electron microscopy; surface analysis instrumentation, electron tomography, micro-tomography...), and material composition (porous, fibrous, granular, hard materials, membranes, surfaces and interfaces...). This paper presents an overview of techniques that have been and are currently developed to address this diversity of problems, such as segmentation, texture analysis, multiscale and directional feature extraction, stochastic models and rendering, among others. Finally, it provides entry-point references to the issues, challenges and opportunities in materials characterization.
Keywords: Image Processing, Image-based Analysis, Materials science, Stochastic modeling, Surface Science, Texture Analysis, Work-flow
29. Incorporating structural/biological a priori in Gene Regulatory Networks Inference using Graph Cuts, Aurélie Pirayre, Camille Couprie, Frédérique Bidard-Michelot, Laurent Duval, Jean-Christophe Pesquet, ESCS 2014, Strasbourg, Second student poster prize (ISCB 2014 highlights)
30. Context-based polynomial extrapolation and slackened synchronization for fast multi-core simulation using FMI, Abir Ben Khaled, Mongi Ben Gaïd, Daniel Simon, Proceedings 10th International Modelica Conference, 10-12 March 2014, Lund, Sweden (preliminary program), [modelica+pdf+HAL]
31. Filtrage de multiples sismiques par ondelettes et optimisation convexe, Mai Quyen Pham, Caroline Chaux, Laurent Duval, Jean-Christophe Pesquet, Colloque GRETSI 2013, Brest, France [poster+HAL+page]
32. Seismic Multiple Removal with A Primal-Dual Proximal Algorithm, Mai Quyen Pham, Caroline Chaux, Laurent Duval, Jean-Christophe Pesquet, 38th International Conference on Acoustics, Speech, and Signal Processing (ICASSP 2013), Vancouver, Canada [page+hal+ieee]
Both random and structured perturbations affect seismic data. Their removal, to unveil meaningful geophysical information, requires additional priors. Seismic multiples are one form of structured perturbations related to wave-field bouncing. In this paper, we model these undesired signals through a time-varying filtering process accounting for inaccuracies in amplitude, time-shift and average frequency of available templates. We recast the problem of jointly estimating the filters and the signal of interest (primary) in a new convex variational formulation, allowing the incorporation of knowledge about the noise statistics. By making some physically plausible assumptions about the slow time variations of the filters, and by adopting a potential promoting the sparsity of the primary in a wavelet frame, we design a primal-dual algorithm which yields good performance in the provided simulation examples.
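The time-varying filtering model of the multiples can be caricatured as a toy in which a short FIR filter is held constant over each small time window. Window length, filter length and the data are illustrative assumptions here, not the paper's settings.

```python
import numpy as np

def apply_time_varying_filter(template, filters, win):
    """Filter `template` with a different short FIR filter on each window of
    length `win`: a toy version of a slowly time-varying filtering process."""
    out = np.zeros_like(template)
    p = filters.shape[1]                       # filter length
    for k, h in enumerate(filters):            # one filter per time window
        for i in range(k * win, min((k + 1) * win, len(template))):
            for j in range(p):                 # causal convolution at sample i
                if i - j >= 0:
                    out[i] += h[j] * template[i - j]
    return out

rng = np.random.default_rng(0)
template = rng.standard_normal(64)
filters = rng.standard_normal((8, 3))          # 8 windows, 3-tap filters
multiple = apply_time_varying_filter(template, filters, win=8)
```

When all per-window filters coincide, this reduces to ordinary causal convolution of the template, which is the degenerate (time-invariant) case of the model.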
33. An overview of signal processing issues in chemical sensing, Laurent Duval, Leonardo Duarte, Christian Jutten, [hal+ieee], 38th International Conference on Acoustics, Speech, and Signal Processing (ICASSP 2013), Vancouver, Canada
This tutorial paper aims at summarizing some problems, ranging from analytical chemistry to novel chemical sensors, that can be addressed with classical or advanced methods of signal and image processing. We gather them under the denomination of "chemical sensing". It is meant to introduce the special session "Signal Processing for Chemical Sensing" with a large overview of issues which have been and remain to be addressed in this application domain, including chemical analysis leading to PARAFAC/tensor methods, hyper spectral imaging, ion-sensitive sensors, artificial nose, chromatography, mass spectrometry, etc. For enlarging and illustrating the points of view of this tutorial, the invited papers of the session consider other applications (NMR, Raman spectroscopy, recognition of explosive compounds, etc.) addressed by various methods, e.g. source separation, Bayesian, and exploiting typical chemical signal priors like positivity, linearity, unit-concentration or sparsity.
34. Unary adaptive subtraction of joint multiple models with complex wavelet frames, Sergi Ventosa, Sylvain Le Roy, Irène Huard, Antonio Pica, Hérald Rabeson, Laurent Duval, Proceedings of the SEG Annual Meeting, 2012 [poster+page]
Multiple attenuation is one of the greatest challenges in seismic processing. Due to the high cross-correlation between primaries and multiples, attenuating the latter without distorting the former is a complicated problem. We propose here a joint multiple model-based adaptive subtraction, using single-sample unary filters' estimation in a complex wavelet transformed domain. The method offers more robustness to incoherent noise through redundant decomposition. It is first tested on synthetic data, then applied on real-field data, with a single-model adaptation and a combination of several multiple models.
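A hypothetical sketch of the unary (single-tap) adaptive subtraction idea: each complex coefficient of the multiple model is rescaled by one complex factor before subtraction. The neighbourhood-based least-squares estimator and its size are illustrative assumptions, not the paper's exact estimator.

```python
import numpy as np

def unary_adaptive_subtraction(d, m, half=4, eps=1e-12):
    """Subtract a multiple model `m` from data `d` (complex wavelet
    coefficients) with a one-tap (unary) complex filter estimated over a
    small sliding neighbourhood: a_i = <m, d>_loc / <m, m>_loc."""
    n = len(d)
    primary = np.empty(n, dtype=complex)
    for i in range(n):
        sl = slice(max(0, i - half), min(n, i + half + 1))
        num = np.vdot(m[sl], d[sl])            # sum of conj(m) * d locally
        den = np.vdot(m[sl], m[sl]).real + eps
        primary[i] = d[i] - (num / den) * m[i]
    return primary

# Toy check: data that are exactly a scaled model are cancelled
rng = np.random.default_rng(0)
m = rng.standard_normal(256) + 1j * rng.standard_normal(256)
p = unary_adaptive_subtraction(2.0 * m, m)
```

Because the filter has a single complex tap, it adjusts local amplitude and phase only, which is what makes the scheme robust when primaries and multiples are strongly correlated.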
35. A Convex Variational Approach for Multiple Removal in Seismic Data, Diego Gragnaniello, Caroline Chaux, Jean-Christophe Pesquet and Laurent Duval, EUSIPCO 2012, August 27-31, Bucharest, Romania
Due to complex subsurface structure properties, seismic records often suffer from coherent noises such as multiples. These undesired signals may hide the signal of interest, thus raising difficulties in interpretation. We propose a new variational framework based on Maximum A Posteriori (MAP) estimation. More precisely, the problem of multiple removal is formulated as a minimization problem involving time-varying filters, assuming that a disturbance signal template is available and the target signal is sparse in some orthonormal basis. We show that estimating multiples is equivalent to identifying filters and we propose to employ recently proposed convex optimization procedures based on proximity operators to solve the problem. The performance of the proposed approach as well as its robustness to noise is demonstrated on realistically simulated data. [Keywords: convex optimization, wavelets, time-varying filters, regularization]
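The proximity-operator machinery invoked above can be illustrated with the simplest such scheme, forward-backward splitting (ISTA) on an l1-penalized least-squares toy problem; the operator `H`, the sizes and the penalty weight are placeholders, and the paper's actual algorithm and observation model are more elaborate.

```python
import numpy as np

def soft_threshold(v, t):
    """Proximity operator of t * ||.||_1 (component-wise soft thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def forward_backward(H, d, lam, n_iter=2000):
    """Minimize 0.5 * ||H x - d||^2 + lam * ||x||_1 by proximal gradient:
    a gradient step on the smooth term, then the prox of the l1 term."""
    step = 1.0 / np.linalg.norm(H, 2) ** 2    # 1 / Lipschitz constant
    x = np.zeros(H.shape[1])
    for _ in range(n_iter):
        x = soft_threshold(x - step * (H.T @ (H @ x - d)), step * lam)
    return x

rng = np.random.default_rng(3)
H = rng.standard_normal((20, 10))
x_true = np.zeros(10)
x_true[[1, 6]] = [2.0, -1.0]                  # sparse ground truth
x_hat = forward_backward(H, H @ x_true, lam=1e-3)
```

With a small penalty and noiseless data, the iterates converge to a slightly biased copy of the sparse ground truth.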
36. Coherent noise removal in seismic data with redundant multiscale directional filters, Sergi Ventosa, Hérald Rabeson, and Laurent Duval, EUSIPCO Conference, September 2011, Barcelona, Catalunya, Spain [abstract+page (figures)]
Directional filters are commonly used tools in modern seismic data processing to address coherent signals, depending on their apparent slowness or slope. This operation enhances the characterization of the great variety of signals present in a seismic dataset that enables a better characterization of the subsurface structure. This paper compares two complementary local adaptive multiscale directional filters: a directional filter bank based on dual-tree M-band wavelets and a novel local slant stack transform (LSST) based filter in the timescale domain. Their differences reside in redundancy levels and slope (directional) resolution. A structural similarity index measure has been employed to objectively compare both approaches on a real seismic dataset example.
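The structural similarity measure used for the comparison can be written compactly. Below is a single-window variant; real implementations such as scikit-image's `structural_similarity` average this quantity over local sliding windows.

```python
import numpy as np

def global_ssim(x, y, data_range=1.0):
    """Single-window structural similarity index between two images."""
    c1 = (0.01 * data_range) ** 2              # standard stabilizing constants
    c2 = (0.03 * data_range) ** 2
    mx, my = x.mean(), y.mean()
    vx, vy = x.var(), y.var()
    cov = ((x - mx) * (y - my)).mean()
    return ((2 * mx * my + c1) * (2 * cov + c2)) / \
           ((mx**2 + my**2 + c1) * (vx + vy + c2))

rng = np.random.default_rng(0)
a = rng.random((32, 32))
b = a + 0.2 * rng.random((32, 32))
score = global_ssim(a, b)                      # below 1 for the degraded copy
```

An identical pair scores exactly 1; any luminance, contrast or structure mismatch pulls the score down, which is why it is preferred to plain MSE for such comparisons.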
37. Complex wavelet adaptive multiple subtraction with unary filters, Sergi Ventosa, Hérald Rabeson, Patrice Ricarte and Laurent Duval, EAGE Conference, June 2011, Vienna, Austria [paper+abstract+slides+page]
Multiple attenuation is a crucial task in seismic data processing because multiples usually cover primaries from fundamental reflectors. Predictive multiple suppression methods remove these multiples by building an adapted model, aiming at being subtracted from the original signal. However, before the subtraction is applied, a matching filter is required to minimize amplitude differences and misalignments between multiples and their prediction, and thus to minimize the multiples in the input dataset after the subtraction. In this paper we focus on the subtraction element. The proposed complex wavelet transform based approach simplifies the matching filter estimation.
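In its plainest form, the matching-filter step described above is a least-squares FIR fit between the predicted multiples and the record. This sketch (tap count and sizes arbitrary) works in the time domain, whereas the paper performs the estimation on complex wavelet coefficients.

```python
import numpy as np

def matching_filter(data, model, taps=11):
    """Least-squares FIR matching filter: find f minimizing
    ||data - f * model||^2 and return the adapted model f * model."""
    n = len(data)
    # Convolution matrix: column j holds the model delayed by j samples
    M = np.column_stack([np.concatenate([np.zeros(j), model[:n - j]])
                         for j in range(taps)])
    f, *_ = np.linalg.lstsq(M, data, rcond=None)
    return M @ f                                # matched multiple estimate

rng = np.random.default_rng(0)
model = rng.standard_normal(128)
data = 0.5 * np.concatenate([np.zeros(3), model[:125]])  # scaled, delayed model
adapted = matching_filter(data, model)
```

When the data are an exactly scaled and delayed copy of the model, the filter absorbs both the amplitude difference and the misalignment, and the subtraction residual vanishes.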
38. Development of Specific Tools for Analysis and Quantification of Pre-ignition in a Boosted SI Engine, Jean-Marc Zaccardi, Matthieu Lecompte, Laurent Duval, Alexandre Pagot, 2009-01-1795, SAE Conference, June 2009
39. Two denoising SURE-LET methods for complex oversampled subband decompositions, Jérôme Gauthier, Laurent Duval, Jean-Christophe Pesquet, European Signal Processing Conference (EUSIPCO 2008), Lausanne, Switzerland, 25-29 August 2008.
Redundancy in wavelets and filter banks has the potential to greatly improve signal and image denoising. Having developed a framework for optimized oversampled complex lapped transforms, we propose their association with the statistically efficient Stein's principle in the context of mean square error estimation. Under Gaussian noise assumptions, expectations involving the (unknown) original data are expressed using the observation only. Two forms of Stein's Unbiased Risk Estimators, derived in the coefficient and the spatial domain respectively, are proposed, the latter being more computationally expensive. These estimators are then employed for denoising with linear combinations of elementary threshold functions. Their performances are compared to the oracle, and addressed with respect to the redundancy. They are finally tested against other denoising algorithms. They prove competitive, yielding especially good results for texture preservation.
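Stein's principle, as used above, replaces the unknown-data mean-square error by an observable surrogate. For a single soft threshold (rather than the paper's linear combinations of thresholds in a lapped-transform domain) it takes the classical closed form below.

```python
import numpy as np

def soft(y, t):
    """Soft thresholding of y at level t."""
    return np.sign(y) * np.maximum(np.abs(y) - t, 0.0)

def sure_soft(y, t, sigma):
    """Stein's Unbiased Risk Estimate of E||soft(y, t) - x||^2 for
    y = x + n, n ~ N(0, sigma^2): it involves the observation y only."""
    return (np.sum(np.minimum(y**2, t**2))
            + 2.0 * sigma**2 * np.count_nonzero(np.abs(y) > t)
            - y.size * sigma**2)

# The estimate tracks the true (oracle) error without ever seeing x
rng = np.random.default_rng(0)
x = 2.0 * rng.standard_normal(20000)
y = x + rng.standard_normal(20000)
true_err = np.sum((soft(y, 1.5) - x) ** 2)
estimate = sure_soft(y, 1.5, sigma=1.0)
```

Minimizing such an estimate over the threshold (or over the weights of a linear combination of thresholds, as in SURE-LET) yields a data-driven denoiser close to the oracle choice.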
40. Coherent noise removal in seismic data with dual-tree M-band wavelets
Laurent Duval, Caroline Chaux, Stéphane Ker
SPIE Optics and Photonics – Wavelets XII, San Diego, 26-29 August 2007. (Wavelets XII)
Seismic data and their complexity still challenge signal processing algorithms in several applications. The advent of wavelet transforms has allowed improvements in tackling denoising problems. We propose here coherent noise filtering in seismic data with the dual-tree M-band wavelet transform. It offers the possibility to decompose data locally with improved multiscale directions and frequency bands. Denoising is performed in a deterministic fashion in the directional subbands, depending on the coherent noise properties. Preliminary results show that these transforms consistently better preserve the seismic signal of interest embedded in highly energetic directional noises than discrete critically sampled and redundant separable wavelet transforms.
41. Short-term spectral analysis and synthesis improvements with oversampled inverse filter banks, Jérôme Gauthier, Laurent Duval
SPIE Optics and Photonics – Wavelets XII, San Diego, 26-29 August 2007 [conference+citeseer+codes]
The Short-Term Fourier Transform (STFT) is a classical linear time-frequency (T-F) representation. Despite its relative simplicity, it has become a standard tool for the analysis of non-stationary signals. Since it provides a redundant representation, it raises some issues such as (i) "optimal" window choice for analysis, (ii) existence and determination of an inverse transformation, (iii) performance of analysis-modification-synthesis, or reconstruction of selected components of the time-frequency plane, and (iv) redundancy controllability for low-cost applications, e.g. real-time computations. We address some of these issues, as well as the less often mentioned problem of transform symmetry in the inverse, through oversampled FBs and their optimized inverse(s) in a slightly more general setting than the discrete windowed Fourier transform.
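Issues (ii) and (iii), the existence of an inverse and analysis-modification-synthesis, can be illustrated with a standard redundant STFT. This uses SciPy's generic stft/istft with a Hann window at 50% overlap, not the optimized oversampled filter banks of the paper; the signal and cut-off are arbitrary.

```python
import numpy as np
from scipy.signal import stft, istft

fs = 8000.0
n = 2048
t = np.arange(n) / fs
x = np.sin(2 * np.pi * 440 * t) + 0.5 * np.sin(2 * np.pi * 1200 * t)

# Redundant analysis: Hann windows with 50% overlap, so an inverse exists
# and the overlap-add synthesis reconstructs the signal (numerically) exactly
f, _, Z = stft(x, fs=fs, nperseg=256)
_, x_rec = istft(Z, fs=fs, nperseg=256)          # plain round trip

Z_mod = Z.copy()
Z_mod[f > 800.0, :] = 0                          # modification: crude low-pass
_, x_lp = istft(Z_mod, fs=fs, nperseg=256)       # synthesis of modified plane
```

Zeroing T-F bins is the simplest analysis-modification-synthesis: the round trip is exact, while the modified synthesis keeps only the 440 Hz component (up to window leakage).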
42. Construction de bancs de filtres faiblement redondants, Jérôme Gauthier, Laurent Duval, Jean-Christophe Pesquet, 21e colloque GRETSI sur le traitement du signal et des images, Troyes, France, September 2007 [conference]
43. Seismic data analysis with dual-tree M-band wavelet transforms, Laurent Duval, Caroline Chaux, and Stéphan Ker, 69th European Assoc. of Geophysicists and Engineers (EAGE 2007) Conference, London, UK, June 2007 [conference+pdf+earthdoc]
The complexity of seismic data still challenges signal processing algorithms in several applications. The rediscovery of wavelet transforms by J. Morlet et al. has allowed improvements in addressing several data representation (local analysis, compression) and restoration problems. However, despite their achievements, traditional approaches based on discrete and separable (both for computational purposes) wavelets fail at efficiently representing directional or higher-dimensional data features, such as line or plane singularities, especially in severe noise conditions. Subsequent extensions to wavelets (multiscale pyramids, curvelets, contourlets, bandlets) have recently generated tremendous theoretical and practical interest. They feature local and multiscale properties associated with a certain amount of redundancy, which may represent an issue for the processing of huge datasets. We propose here seismic data processing based on dual-tree M-band wavelet transforms. They combine:
• orthogonal M-band filter banks which better separate frequency bands in seismic data than wavelets, due to the increased degrees of freedom in the filter design,
• Hilbert transform and complex representation of seismic signals which have been effective, especially for attributes definition, with a relatively low redundancy (a factor of two). These transforms have been successfully applied to random noise removal in traditional and remote sensing imagery. We apply them to seismic data and address their potential for local slope analysis and coherent noise (ground-roll) filtering.
44. Local directional and uncoherent seismic noise filtering with oversampled filter banks, Jérôme Gauthier, Marie-Christine Cacas, Laurent Duval, 69th European Assoc. of Geophysicists and Engineers (EAGE 2007) Conference, London, UK, June 2007 [conference+earthdoc]
Seismic data are subject to different kinds of unwanted perturbations. These random or organised noises, which can for instance be acquisition- or processing-related, may disturb geophysical interpretations and thwart attempts at automated processing. Since the relative features (e.g. amplitude, spectrum) of the signals of interest and the noises may vary locally, signal and noise separation is obtained by local data-driven filtering with two- or three-dimensional oversampled complex filter banks. Filter banks decompose the noisy data onto frequency bands and directions over restricted subregions (sub-images or sub-volumes), acting like a local FK with improved properties. The transforms studied in this work feature smooth overlapping of sub-regions, to avoid tiling effects while allowing signal reconstruction from the transformed domain. The proposed methodology uses limited-redundancy filtering that yields both enhanced noise robustness (due to oversampling) and tractable 2D or 3D processing, since the transforms are optimized to limit the redundancy cost. Coupling these redundant transforms with a processing method designed to detect and compute locally dominant directions, and to remove unwanted directions and random noise, leads to good visual results. Tests were performed on 2D and 3D seismic data.
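The "local FK" behaviour can be caricatured on a single patch: a plane wave maps to slope-dependent support in the 2D Fourier (FK) plane, where a wedge of slopes can be masked out. The hard mask below is a crude illustration; the paper's filter banks provide smooth, overlapping, low-redundancy versions of this operation.

```python
import numpy as np

def fk_reject(patch, slope_min, slope_max):
    """Zero out FK components of a 2D patch (time x offset) whose apparent
    slope k/f lies in [slope_min, slope_max]. A local version applies this
    on smoothly overlapping sub-images instead of one hard-windowed patch."""
    ft = np.fft.fftfreq(patch.shape[0])[:, None]   # temporal frequency axis
    fx = np.fft.fftfreq(patch.shape[1])[None, :]   # spatial wavenumber axis
    with np.errstate(divide="ignore", invalid="ignore"):
        slope = np.where(ft != 0, fx / ft, np.inf)
    keep = ~((slope >= slope_min) & (slope <= slope_max))
    return np.real(np.fft.ifft2(np.fft.fft2(patch) * keep))

i, j = np.meshgrid(np.arange(64), np.arange(64), indexing="ij")
dipping = np.cos(2 * np.pi * (8 * i + 8 * j) / 64)   # slope-1 plane wave
flat = np.cos(2 * np.pi * 8 * i / 64)                # slope-0 plane wave
out = fk_reject(dipping + flat, 0.5, 2.0)            # removes only the dip
```

Because each plane wave occupies a line through the FK origin, selecting slope wedges separates events by apparent dip, which is the basic mechanism behind directional noise removal.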
45. Polychrom: a software for semi-automated quantitative GCxGC analysis with identification profiles, Benoît Celse, Fabrice Bertoncini, Frédérik Adam, Nicolas Brodusch, Sébastien Esnault, Laurent Duval, Dalian International Symposia and Exhibition on Chromatography (DISEC 2007), 4th GCxGC Symposium, Dalian, China, June 2007 (DISEC 2007)
46. Automatic template fit in comprehensive two dimensional gas chromatography
Benoît Celse, Fabrice Bertoncini, Laurent Duval, Frédérik Adam
Dalian International Symposia and Exhibition on Chromatography (DISEC 2007), 4th GCxGC Symposium, Dalian, China, June 2007 (DISEC 2007)
47. Comprehensive two dimensional gas chromatography image segmentation using the level-set technique
Laurent Duval, Yacine Bourkeb, Benoît Celse, Fabrice Bertoncini, Frédérik Adam, Rachid Deriche
Dalian International Symposia and Exhibition on Chromatography (DISEC 2007), 4th GCxGC Symposium, Dalian, China, June 2007 (DISEC 2007)
48. 2D dual-tree complex biorthogonal M-band wavelet transform, Caroline Chaux, Jean-Christophe Pesquet, Laurent Duval, International Conference on Acoustics, Speech and Signal Processing (ICASSP 2007), Honolulu, Hawaii, USA, April 2007 [conference 2007]
Dual-tree wavelet transforms have recently gained popularity since they provide low-redundancy directional analyses of images. In our recent work, dyadic real dual-tree decompositions have been extended to the M-band case, thus adding much flexibility to this analysis tool. In this work, we propose to further extend this framework on two fronts by considering (i) biorthogonal and (ii) complex M-band dual-tree decompositions. Denoising results are finally provided to demonstrate the validity of the proposed design rules. Keywords: Wavelet transforms, Hilbert transforms, Image analysis, Image processing, Gaussian noise.
49. Algorithm comparison for real time knock detection (pdf)
Stéphan Ker, Frédéric Bonnardot, Laurent Duval, International Conference on Acoustics, Speech and Signal Processing (ICASSP 2007), Honolulu, Hawaii, USA, April 2007 (ICASSP 2007)
This paper addresses the implementation and comparison of algorithms for real-time knock detection. Knock is an unwanted abnormal combustion process that may damage engines and limit their efficiency. For series vehicles, knock detection is generally obtained from knock sensors that capture other noise sources, thus requiring robust algorithms. In order to estimate the performance of time-frequency and Kalman filter based algorithms, a knock signal model is proposed and the algorithms are tested under various noise conditions. Experiments on modelled and real signals show the superiority of the recently developed S-method with respect to extended Kalman filtering. Keywords: Knock, Real time systems, Time-Frequency Analysis, Kalman filtering, Wigner distributions
50. Oversampled inverse complex lapped transform optimization, Jérôme Gauthier, Laurent Duval, Jean-Christophe Pesquet, International Conference on Acoustics, Speech and Signal Processing (ICASSP 2007), Honolulu, Hawaii, USA, April 2007 [conference]
When an oversampled FIR filter bank structure is used for signal analysis, a main problem is to guarantee its invertibility and to be able to determine an inverse synthesis filter bank. As the analysis scheme corresponds to a redundant decomposition, there is no unique inverse filter bank and some of the solutions can lead to artifacts in textured image filtering applications. In this paper, the flexibility in the choice of the inverse filter bank is exploited to find the best-localized impulse responses. The design is performed by solving a constrained optimization problem which is reformulated in a smaller dimensional space. Application to seismic data clearly shows the improvements brought by the optimization process. Keywords: FIR digital filters, Transforms, Redundancy, Optimization methods, Seismic signal processing
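The non-uniqueness exploited above is elementary linear algebra: a redundant analysis is a tall matrix, any left inverse of it is a valid synthesis, and the Moore-Penrose pseudo-inverse is only one (the minimum-norm) choice. A small numerical illustration with arbitrary sizes:

```python
import numpy as np

rng = np.random.default_rng(1)
n, m = 8, 12                        # 8 signal samples, 12 analysis coefficients
A = rng.standard_normal((m, n))     # tall analysis operator (redundant frame)

S = np.linalg.pinv(A)               # one synthesis choice: the pseudo-inverse
x = rng.standard_normal(n)
x_rec = S @ (A @ x)                 # exact reconstruction from redundant coeffs

# Non-uniqueness: adding anything supported on the left null space of A
# yields another exact synthesis operator
P = np.eye(m) - A @ S               # projector onto that null space
S2 = S + rng.standard_normal((n, m)) @ P
```

The optimization in the paper exploits exactly this degree of freedom, searching among the valid inverses for the one with the best-localized impulse responses.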
51. Real time knock detection with DFT-based time-frequency analysis, Stéphan Ker, Laurent Duval, Physics in Signal and Image Processing (PSIP 2007), Mulhouse, France, January-February 2007 [conference]
52. Automatic blob fitting in comprehensive two-dimensional gas chromatography images, Benoît Celse, Frédérik Adam, Fabrice Bertoncini, Stéphane Bres, Laurent Duval, Physics in Signal and Image Processing (PSIP 2007), Mulhouse, France, January-February 2007 [conference]
53. 3D seismic data processing with complex non separable multidimensional lapped transforms, Jérôme Gauthier, Laurent Duval, Physics in Signal and Image Processing (PSIP 2007), Mulhouse, France, January-February 2007 [conference]
54. Inversion de bancs de filtres M-bandes sur-échantillonnés
Jérôme Gauthier, Laurent Duval
Journée nationale sur les logiciels de modélisation et de calcul scientifique (LMCS 2006), Paris-La Défense, France, November 2006 [conference]
55. M-band filter banks and dual-tree wavelets for engine combustion and geophysical image analysis (invited paper), Laurent Duval, Caroline Chaux, Jean-Christophe Pesquet, Wavelet Applications in Industrial Processing IV on SPIE Symposium Optics East, Boston, MA, USA, October 2006 [conference+pdf]
Signals and images in industrial applications are often subject to strong disturbances and thus require robust methods for their analysis. Since these data are often non-stationary, time-scale or time-frequency tools have demonstrated effectiveness in their handling. More specifically, wavelet transforms and other filter bank generalizations are particularly suitable, due to their discrete implementation. We have recently investigated a specific family of filter banks, the M-band dual-tree wavelet, which provides state-of-the-art performance for image restoration. It generalizes a Hilbert-pair-based decomposition structure, first proposed by N. Kingsbury and further investigated by I. Selesnick. In this work, we apply this frame decomposition to the analysis of two examples of signals and images in an industrial context: detection of structures and noises in geophysical images, and the comparison of direct and indirect measurements resulting from engine combustion. Keywords: M-band wavelets, Hilbert transform, Dual-tree, Image denoising, Direction analysis.
56. Combustion diagnosis for internal combustion engines with real-time acquisition and processing
Stéphan Ker, Fabrice Guillemin, Laurent Duval
ISMA 2006, International Conference on Noise and Vibration Engineering, Leuven, Belgium, September 2006 [conference]
57. Low redundancy oversampled Lapped transforms and application to 3D seismic data
Jérôme Gauthier, Laurent Duval, and Jean-Christophe Pesquet
International Conference on Acoustics, Speech and Signal Processing (ICASSP 2006), Toulouse, France, May 2006 [conference]
In a previous work, we proposed a relatively simple method to build non separable perfect reconstruction oversampled lapped transforms. The main drawback of this method was that the redundancy factor was constrained to be equal to the overlapping one. This constitutes a strong limitation for applications such as seismic processing involving three-dimensional data sets. The memory requirements may indeed become hard to meet if the redundancy is not reduced. In this paper, we propose an approach to guarantee that a given lapped transform is invertible by a finite length filter bank. We show how to compute a corresponding synthesis filter bank. The proposed analysis/synthesis filter bank system is applied to directional filtering of noisy three-dimensional seismic data.
58. A new estimator for image denoising using a 2D Dual-Tree M-band Wavelet Decomposition
Caroline Chaux, Laurent Duval, Amel Benazza-Benyahia and Jean-Christophe Pesquet
International Conference on Acoustics, Speech and Signal Processing (ICASSP), Toulouse, France, May 2006 (ICASSP 2006)
59. Surface wave attenuation in foothills areas: two innovative approaches, Laurent Duval, Martine Ancel, Marc Becquey, Karine Broto, 75th International Meeting, Society of Exploration Geophysicists, SEG 2005, Houston, USA, November 2005 [conference]
Surface wave attenuation is a difficult problem to address in foothills areas due to the high variability of their typology. Such recorded events can indeed be non-linear and even mimic reflection hyperbolas. We present two innovative approaches that exploit the combination of different criteria so as to discriminate between effective signal and noise. The first one makes use of both apparent velocity and polarisation. Its ability to detect noise is demonstrated on a synthetic multi-component data set inspired from South American foothills. The second one combines apparent velocity and a decomposition on different time and space scales. Improvements over classical FK filtering are shown on a single-component real data set from Canadian foothills.
60. Étude du bruit dans une analyse M-bandes en arbre dual, (pdf)
Caroline Chaux, Laurent Duval, Jean-Christophe Pesquet
20e colloque GRETSI sur le traitement du signal et des images, Louvain, Belgium, September 2005 [pdf+conference+session]
In this work, we study the properties of an additive noise undergoing a dual-tree M-band wavelet analysis. We express the relationships governing noise coefficients both in the primal and the dual tree. The knowledge of the noise statistical properties is particularly useful for the design of efficient denoising methods in the framework of a dual-tree wavelet analysis. Our main contribution consists in the computation of the resulting cross-correlation functions for several M-band wavelet families. More specifically, we show that pairwise coefficients from the primal and the dual tree, resulting from a white noise decomposition, are uncorrelated. Yet, there exists a significant local correlation, whose extent depends on the choice of the wavelet pair.
61. A non-separable 2D complex modulated lapped transform and its applications to seismic data filtering, Jérôme Gauthier, Laurent Duval, Jean-Christophe Pesquet, EUSIPCO 2005, European Signal Processing Conference, Antalya, Turkey, September 2005 [conference]
62. Seismic data analysis and filtering with dual-tree M-band wavelets
Laurent Duval, Caroline Chaux, Jean-Christophe Pesquet and Karine Broto, SIAM GS 2005, SIAM Conference on Mathematical and Computational Issues in the Geosciences, Avignon, France, June 2005 [conference]
63. Seismic data analysis with a dual-tree M-band wavelet transform
Caroline Chaux, Laurent Duval and Jean-Christophe Pesquet
67th European Assoc. of Geophysicists and Engineers (EAGE 2005) Conference, Madrid, Spain, June 2005 [conference]
64. 2D Dual-Tree M-band Wavelet Decomposition, Caroline Chaux (winner of the Student Paper Contest), Laurent Duval and Jean-Christophe Pesquet, International Conference on Acoustics, Speech and Signal Processing (ICASSP 2005), Philadelphia, USA, March 2005 [conference]
65. Hilbert Pairs of M-band Orthonormal Wavelet Bases, Caroline Chaux, Laurent Duval and Jean-Christophe Pesquet, EUSIPCO 2004, Proc. European Signal and Image Processing Conference, Sept. 2004, pp. 1187-1190 [conference]
66. Evaluation of comprehensive two-dimensional gas chromatography (GCxGC) for the detailed characterization of complex hydrocarbon mixtures
Colombe Vendeuvre, Fabrice Bertoncini, Laurent Duval, S. Penet, F. Monot, Franck Haeseler, D. Thiébaut, M.-C. Hennion, ASMS 2004, 27th International Symposium on Capillary Chromatography, Riva del Garda, Italy, May-June 2004 [conference]
67. Noise reduction of images with multiple subband transforms, Toshihisa Tanaka, Laurent Duval, International Conference on Image Processing (ICIP 2004), Singapore, Oct. 2004 [conference]
68. Analysis of products issued from paraffins dehydrogenation by fast GC/MS, HRGC/MS and comprehensive GCxGC
Nancy Leymarie, Colombe Vendeuvre, Laurent Duval, Fabrice Bertoncini, Christophe Roullet, Frédérique Prigent, Nadine Bucher, Stéphane Morin
52nd American Society for Mass Spectrometry Conference on Mass Spectrometry and Allied Topics, Nashville, USA-TN, May 2004 (ASMS 2004)
69. Seismic filtering in a Lapped Transform domain with hidden Markov Models
Laurent Duval
66th European Assoc. of Geophysicists and Engineers (EAGE) Conference, Paris, France, June 2004 (EAGE 2004)
70. Hidden Markov tree image denoising with redundant Lapped transforms
Laurent Duval, Truong Nguyen
International Conference on Acoustics, Speech and Signal Processing (ICASSP), Montréal, Canada, May 2004 (ICASSP 2004)
71. Lapped transform domain denoising using hidden Markov trees
Laurent Duval, Truong Nguyen
International Conference on Image Processing (ICIP), Barcelona, Spain, Sept. 2003 (ICIP 2003)
72. Seismic data filtering with lapped transforms and hidden Markov models
Laurent Duval, Caroline Chaux
Wavelet and Statistics - Watering the seed, Villard de Lans, Grenoble, France, Sept. 4-7 (Grenoble 1994-2003)
73. Geophysical signal denoising using Multiple Wavelet Stacking
Laurent Duval
65th European Assoc. of Geophysicists and Engineers (EAGE) Conference, Stavanger, Norway, June 2003 (EAGE 2003)
74. Denoising of seismic signals with oversampled filter banks
Laurent Duval, Toshihisa Tanaka
International Conference on Acoustics, Speech and Signal Processing (ICASSP), Hong-Kong, China, Apr. 2003
75. Practical aspects of 3D coherent noise filtering using (F-Kx-Ky) or wavelet transform filters
Galibert, P.-Y., Laurent Duval, Dupont, R.
72nd International Meeting, Society of Exploration Geophysicists (SEG), Salt Lake City, United States, Oct. 2002
76. Noise robust spark engine knock detection with redundant wavelet transform
Laurent Duval, Gilles Corde, Van Bui-Tran, Pierre Leduc
International Seminar on Modal Analysis, Noise and Vibration Engineering (ISMA), Leuven, Belgium, Sep. 2002 [issue]
77. Efficient coherent noise filtering: an application of shift-invariant wavelet denoising [slides]
Laurent Duval, Galibert, P.-Y.
64th European Assoc. of Geophysicists and Engineers (EAGE) Conference, Florence, Italy, May 2002
78. Simultaneous seismic compression and denoising using a lapped transform coder
Laurent Duval
International Conference on Acoustics, Speech and Signal Processing (ICASSP), Orlando, USA, May 2002
79. Compression denoising: using seismic compression for incoherent noise removal
Laurent Duval and Van Bui-Tran
63rd European Assoc. of Geoscientists and Engineers (EAGE) Conference, Amsterdam, Netherlands, June 2001
80. Seismic data compression using GULLOTS
Laurent Duval and Takayuki Nagai
International Conference on Acoustics, Speech and Signal Processing (ICASSP), Salt Lake City, USA, May 2001
81. Filter bank decomposition of seismic data with application to compression and denoising
(2000-2001 SEG Student Sections Award of merit)
Laurent Duval, Tage Røsten
70th International Meeting, Society of Exploration Geophysicists (SEG), p. 2055-2058, Calgary, Canada, August 2000
82. GenLOT optimization techniques for seismic data compression
Laurent Duval, Van Bui-Tran, T. Q. Nguyen, T. D. Tran
International Conference on Acoustics, Speech and Signal Processing (ICASSP), p. 2111-2114, Istanbul, Turkey, June 2000
83. Seismic data compression using GenLOT: towards "optimality"?
Laurent Duval, Van Bui-Tran, T. Q. Nguyen, T. D. Tran
Data Compression Conference (DCC), p. 552, Snowbird, UT, March 2000
84. A new class of filter banks for seismic data compression
Laurent Duval, J. Oksman, T. Q. Nguyen
Proc. 69th International Meeting, Society of Exploration Geophysicists (SEG), p. 1907-1910, Houston, TX, Oct. 1999
85. Compression de données sismiques par ondelettes et GenLOT (Seismic data compression with wavelets and GenLOT)
Laurent Duval, Van Bui-Tran
Réunion des théoriciens des circuits de langue française (RTCLF), p. 23-24, Metz, France, Oct. 1999
86. Seismic data compression: a comparative study between GenLOT and wavelet compression
Laurent Duval, T. Q. Nguyen
SPIE Vol. 3813, Wavelet Applications in Signal and Image Processing VII, p. 802-810, M. Unser, A. Aldroubi, and A. F. Laine, editors
Denver, CO, July 1999
87. Seismic data compression and QC (Quality Control) using GenLOT
Laurent Duval, T. Q. Nguyen, T. D. Tran
61st European Assoc. of Geoscientists and Engineers (EAGE) Conference, Helsinki, Finland, June 1999
88. On Progressive Seismic Data Compression using GenLOT
Laurent Duval, T. Q. Nguyen, T. D. Tran
Proc. 33rd Conf. on Information Sciences and Systems (CISS), p. 956-959, Baltimore, MD, March 1999

## Distinctions/Awards (by ricochet)

Second poster presentation award to Aurélie Pirayre et al., "Incorporating structural a priori in Gene Regulatory Network Inference using Graph cuts" [Poster], European Student Council Symposium 2014 (ESCS'14), Strasbourg, France

Best student paper award to Caroline Chaux et al., ICASSP 2005

2000-2001 SEG Student Sections Award of merit with Tage Røsten, NTNU (now Statoil)

## Co-authors and collaborators: forthcoming, present and past enjoyable collaborations

Copyright: This material is presented to ensure timely dissemination of scholarly and technical work. Copyright and all rights therein are retained by authors or by other copyright holders. All persons copying this information are expected to adhere to the terms and constraints invoked by each author's copyright. In most cases, these works may not be reposted without the explicit permission of the copyright holder.

Copyright notice for IEEE publications: (c)20xx IEEE. Personal use of this material is permitted. However, permission to reprint/republish this material for advertising or promotional purposes or for creating new collective works for resale or redistribution to servers or lists, or to reuse any copyrighted component of this work in other works must be obtained from the IEEE.