# Publications using catch22

On this page, we keep track of scientific research publications that use *catch22*. Articles are labeled as follows:

* 📗 = Journal is **open access**.
* :closed\_book: = Journal is **not open access**.
* 📙 = **Freely** accessible pre-print.
* 💻 = **Code/Data available**.

If you have used *catch22* in your published work, please **contact us by** [**email**](mailto:ben.d.fulcher@gmail.com) and we'll add it to this growing list!
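To give a flavour of what an individual *catch22* feature computes, here is a minimal pure-Python sketch of one of the 22 features, `DN_HistogramMode_5` (the location of the mode of a 5-bin histogram of the z-scored time series). This is an illustrative sketch only: the choice of population standard deviation and the tie-handling rule are assumptions, and the C implementation in the *catch22* repository is the authoritative reference.

```python
def dn_histogram_mode_5(x, n_bins=5):
    """Sketch of catch22's DN_HistogramMode_5: the bin centre of the mode
    of a 5-bin histogram of the z-scored series. Normalisation and
    tie-handling details here are assumptions, not the reference behaviour."""
    n = len(x)
    mu = sum(x) / n
    sd = (sum((v - mu) ** 2 for v in x) / n) ** 0.5  # population std (assumption)
    z = [(v - mu) / sd for v in x]

    lo, hi = min(z), max(z)
    width = (hi - lo) / n_bins
    counts = [0] * n_bins
    for v in z:
        # Clamp so the maximum value falls in the last bin
        idx = min(int((v - lo) / width), n_bins - 1)
        counts[idx] += 1

    best = max(counts)
    centres = [lo + (i + 0.5) * width for i, c in enumerate(counts) if c == best]
    return sum(centres) / len(centres)  # average centres over ties (assumption)
```

For example, a series with one large outlier, `[0, 0, 0, 0, 10]`, has its histogram mode in the lowest bin, so the sketch returns the centre of that bin (-0.25 after z-scoring).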

***

## 🖥️ Algorithms <a href="#algs" id="algs"></a>

*catch22* features have formed the basis of new algorithms:

<table data-view="cards" data-full-width="false"><thead><tr><th align="center"></th><th data-hidden data-card-cover data-type="files"></th><th data-hidden data-card-target data-type="content-ref"></th></tr></thead><tbody><tr><td align="center">Combine the Matrix Profile with catch22 features, as C22MP, yielding a state-of-the-art anomaly detector.<br><span data-gb-custom-inline data-tag="emoji" data-code="1f4d5">📕</span><a href="https://doi.org/10.1007/s10115-024-02107-5"> <em>Knowl Inf Sys (2024)</em><mark style="color:red;">.</mark></a></td><td><a href="/files/BE5NYxX0QAPDUOKgPm7G">/files/BE5NYxX0QAPDUOKgPm7G</a></td><td><a href="https://doi.org/10.1007/s10115-024-02107-5">https://doi.org/10.1007/s10115-024-02107-5</a></td></tr><tr><td align="center"><p>For developing <strong>semi-synthetic time series</strong> to understand algorithm performance.</p><p>📙 <a href="https://arxiv.org/abs/2312.01344"><em><mark style="color:orange;">arXiv (2023</mark></em><mark style="color:orange;">)</mark></a><mark style="color:orange;">.</mark></p></td><td><a href="/files/59hn2Tmo8m0QVoYX2juV">/files/59hn2Tmo8m0QVoYX2juV</a></td><td><a href="https://arxiv.org/abs/2312.01344">https://arxiv.org/abs/2312.01344</a></td></tr><tr><td align="center"><p>A new method to <strong>reduce the complexity</strong> of feature-based explanations.</p><p>📗 <a href="https://doi.org/10.1016/j.inffus.2023.101955"><em><mark style="color:green;">Information Fusion (2023)</mark></em></a>.</p></td><td><a href="/files/CxnBKdcLURaQEfGyBEUr">/files/CxnBKdcLURaQEfGyBEUr</a></td><td><a href="https://www.sciencedirect.com/science/article/pii/S1566253523002713?via%3Dihub">https://www.sciencedirect.com/science/article/pii/S1566253523002713?via%3Dihub</a></td></tr><tr><td align="center"><p>The <strong>canonical interval time-series classifier</strong>.</p><p>📗 <a href="https://ieeexplore.ieee.org/document/9378424"><em><mark style="color:green;">IEEE International Conference on Big Data</mark></em> <em><mark 
style="color:green;">(2020)</mark></em><mark style="color:green;">.</mark></a></p></td><td><a href="/files/O091RUR4mE0UhhLZpmvo">/files/O091RUR4mE0UhhLZpmvo</a></td><td><a href="https://doi.org/10.1109/BigData50022.2020.9378424">https://doi.org/10.1109/BigData50022.2020.9378424</a></td></tr></tbody></table>

* "COCALITE: A Hybrid Model COmbining CAtch22 and LITE for Time Series Classification". :closed\_book:[*<mark style="color:red;">Badi et al. IEEE International Conference on Big Data</mark>* *<mark style="color:red;">(2024)</mark>*<mark style="color:red;">.</mark> ](https://ieeexplore.ieee.org/document/10825872)
* A toolbox for predictive analytics (feature extraction and selection) demonstrated on wind turbine bearing condition classification, hydraulic systems condition monitoring, CNC tool wear estimation, and respiratory health monitoring. :closed\_book:[ *<mark style="color:red;">Zhou et al. Annual Symposium on Reliability and Maintainability (2025)</mark>*](https://doi.org/10.1109/RAMS48127.2025.10935234)<mark style="color:red;">.</mark>

***

## Applications

*catch22* features have been used in applications to:

### ⌚ Physiological and other sensors

<table data-view="cards"><thead><tr><th align="center"></th><th data-hidden data-card-cover data-type="files"></th><th data-hidden data-card-target data-type="content-ref"></th></tr></thead><tbody><tr><td align="center">Distinguish <strong>pain responses</strong> to hot and cold stimuli from electrodermal activity and electromyography.<br> <a href="https://doi.org/10.21203/rs.3.rs-4469036/v1">📙 <em><mark style="color:orange;">Ozek et al., Research Square</mark></em><mark style="color:orange;"> (2024).</mark></a></td><td><a href="/files/oNkelw4qxxlFYHcaPv1f">/files/oNkelw4qxxlFYHcaPv1f</a></td><td><a href="https://www.researchsquare.com/article/rs-4469036/v1">https://www.researchsquare.com/article/rs-4469036/v1</a></td></tr><tr><td align="center"><p>Classify hyperkinetic, tonic, and tonic-clonic <strong>seizures</strong> using unsupervised clustering of video signals.</p><p><a href="https://www.frontiersin.org/articles/10.3389/fneur.2023.1270482/full">📗 <em><mark style="color:green;">Frontiers in Neurology</mark></em><mark style="color:green;"> (2023)</mark></a><mark style="color:green;">.</mark> </p></td><td><a href="/files/ydoJvVsEpXKAojgQMaKH">/files/ydoJvVsEpXKAojgQMaKH</a></td><td><a href="https://www.frontiersin.org/journals/neurology/articles/10.3389/fneur.2023.1270482/full">https://www.frontiersin.org/journals/neurology/articles/10.3389/fneur.2023.1270482/full</a></td></tr><tr><td align="center">Estimate <strong>pain intensity</strong> from physiological sensors. <a href="https://arxiv.org/abs/2311.08569">📙 <em><mark style="color:orange;">arXiv</mark></em><mark style="color:orange;"> (2023)</mark></a><mark style="color:orange;">.</mark></td><td><a href="/files/6KS2OXH4eGN9gUFpejSx">/files/6KS2OXH4eGN9gUFpejSx</a></td><td><a href="https://arxiv.org/abs/2311.08569">https://arxiv.org/abs/2311.08569</a></td></tr><tr><td align="center"><p>Classify <strong>human exercises</strong> using wearable sensors and video data.</p><p> <span data-gb-custom-inline 
data-tag="emoji" data-code="1f4d5">📕</span><a href="https://doi.org/10.1007/978-3-031-43427-3_19"> <em><mark style="color:red;">Joint European Conference on Machine Learning and Knowledge Discovery in Databases (2023)</mark></em></a><em><mark style="color:red;">.</mark></em></p></td><td><a href="/files/47X8XJO7shkgYJ94kRF4">/files/47X8XJO7shkgYJ94kRF4</a></td><td><a href="https://link.springer.com/chapter/10.1007/978-3-031-43427-3_19">https://link.springer.com/chapter/10.1007/978-3-031-43427-3_19</a></td></tr><tr><td align="center"><p>Classify <strong>advertising engagement</strong> using affect and physiological signals of heart rate, electrodermal activity, pupil dilation, and skin temperature.</p><p> <a href="https://doi.org/10.3390/s23156916">📗 <em><mark style="color:green;">Sensors</mark></em><mark style="color:green;"> (2023)</mark></a><mark style="color:green;">.</mark></p></td><td><a href="/files/RxNWlKbCrhbdLi4iSgPe">/files/RxNWlKbCrhbdLi4iSgPe</a></td><td><a href="https://www.mdpi.com/1424-8220/23/15/6916">https://www.mdpi.com/1424-8220/23/15/6916</a></td></tr><tr><td align="center"><p>Determine the <strong>type of breathing</strong> using wireless sensors with a motion capture system.</p><p> <a href="https://doi.org/10.3390/a16050249">📗 <em><mark style="color:green;">Algorithms</mark></em><mark style="color:green;"> (2023)</mark></a><mark style="color:green;">.</mark></p></td><td><a href="/files/KzWWoqrznqCetH5nFj9W">/files/KzWWoqrznqCetH5nFj9W</a></td><td><a href="https://www.mdpi.com/1999-4893/16/5/249">https://www.mdpi.com/1999-4893/16/5/249</a></td></tr><tr><td align="center"><p>Automate <strong>general movements assessment</strong> for <strong>Cerebral Palsy</strong> from smartphone videos.</p><p>📗<a href="https://journals.plos.org/digitalhealth/article?id=10.1371/journal.pdig.0000432"><em><mark style="color:green;">PLOS Digital Health (2024).</mark></em></a></p><p> <a href="https://www.medrxiv.org/content/10.1101/2023.04.03.23288092">📙 <em><mark 
style="color:orange;">medRxiv</mark></em><mark style="color:orange;"> (2023)</mark></a><mark style="color:orange;">.</mark></p><p>💻 <a href="https://github.com/epassmore/infant-movements"><em>Code</em></a><em>.</em></p></td><td><a href="/files/GD5PQ2jSnmHXkuS41wtN">/files/GD5PQ2jSnmHXkuS41wtN</a></td><td><a href="https://www.medrxiv.org/content/10.1101/2023.04.03.23288092v1">https://www.medrxiv.org/content/10.1101/2023.04.03.23288092v1</a></td></tr><tr><td align="center"><p><strong>Detect stress levels</strong> in real time from multiple physiological signals (heart rate, blood pressure, electrodermal activity, and respiration).</p><p> <a href="https://ieeexplore.ieee.org/abstract/document/10063853">📗 <em><mark style="color:green;">IEEE Access</mark></em><mark style="color:green;"> (2023).</mark></a></p></td><td><a href="/files/CFUVpazzrLhViLV3tMhh">/files/CFUVpazzrLhViLV3tMhh</a></td><td><a href="https://ieeexplore.ieee.org/abstract/document/10063853">https://ieeexplore.ieee.org/abstract/document/10063853</a></td></tr><tr><td align="center"><p>Comparison to ShapAAL method for <strong>sensor</strong> time-series classification.</p><p> <a href="https://doi.org/10.1371/journal.pone.0277975">📗 <em><mark style="color:green;">PLOS ONE</mark></em><mark style="color:green;"> (2022)</mark></a><mark style="color:green;">.</mark></p></td><td><a href="/files/PtEdnAIhShvbYd57xLgt">/files/PtEdnAIhShvbYd57xLgt</a></td><td><a href="https://journals.plos.org/plosone/article?id=10.1371/journal.pone.0277975">https://journals.plos.org/plosone/article?id=10.1371/journal.pone.0277975</a></td></tr><tr><td align="center"><p>Extract <strong>markers of cardiometabolic disease</strong> <strong>risk</strong> from wearable device recordings.</p><p> <a href="https://doi.org/10.2196/34669">📗 <em><mark style="color:green;">Journal of Medical Internet Research</mark></em><mark style="color:green;"> (2022)</mark></a><mark style="color:green;">.</mark></p></td><td><a 
href="/files/jnToV0Pgxt7XZduKGXBh">/files/jnToV0Pgxt7XZduKGXBh</a></td><td><a href="https://www.jmir.org/2022/7/e34669">https://www.jmir.org/2022/7/e34669</a></td></tr><tr><td align="center">Predict <strong>behavioural change</strong> from physiological signals. <a href="https://doi.org/10.3390/s22093468">📗 <em><mark style="color:green;">Sensors</mark></em><mark style="color:green;"> (2022)</mark></a><mark style="color:green;">.</mark></td><td><a href="/files/Kd0zmzPu4iHuPz7RqARw">/files/Kd0zmzPu4iHuPz7RqARw</a></td><td><a href="https://www.mdpi.com/1424-8220/22/9/3468#">https://www.mdpi.com/1424-8220/22/9/3468#</a></td></tr><tr><td align="center"><p>Estimate <strong>objective pain intensity</strong> using physiological sensors, paving the way for developing a wearable pain measurement device.</p><p> <a href="https://doi.org/10.1371/journal.pone.0254108">📗 <em><mark style="color:green;">PLOS ONE</mark></em><mark style="color:green;"> (2021)</mark></a><mark style="color:green;">.</mark></p></td><td><a href="/files/hoaNgKK0mx9WxsX0J5H3">/files/hoaNgKK0mx9WxsX0J5H3</a></td><td><a href="https://journals.plos.org/plosone/article?id=10.1371/journal.pone.0254108">https://journals.plos.org/plosone/article?id=10.1371/journal.pone.0254108</a></td></tr></tbody></table>

***

### 🧬 Biology, physiology, neuroscience, pathology, and ecology

<table data-view="cards"><thead><tr><th align="center"></th><th data-hidden data-card-cover data-type="files"></th><th data-hidden data-card-target data-type="content-ref"></th></tr></thead><tbody><tr><td align="center">Predict pathological complete response in <strong>breast cancer</strong> from dynamic contrast-enhanced <strong>magnetic resonance images</strong>.<br> 📗 <a href="https://breast-cancer-research.biomedcentral.com/articles/10.1186/s13058-024-01836-3"><em><mark style="color:green;">Breast Cancer Research</mark></em><mark style="color:green;"> (2024).</mark></a></td><td><a href="/files/q9rOTqTuzQxG3rWqhBIH">/files/q9rOTqTuzQxG3rWqhBIH</a></td><td><a href="https://breast-cancer-research.biomedcentral.com/articles/10.1186/s13058-024-01836-3">https://breast-cancer-research.biomedcentral.com/articles/10.1186/s13058-024-01836-3</a></td></tr><tr><td align="center"><p>Predict species and age of <strong>malarial mosquitoes</strong> using mid-infrared spectroscopy.</p><p> <a href="https://doi.org/10.5334/dsj-2024-025">📗 <em><mark style="color:green;">Data Science Journal</mark></em><mark style="color:green;"> (2024).</mark></a><br></p></td><td><a href="/files/uu5LyRxj3IaIprwLTMXN">/files/uu5LyRxj3IaIprwLTMXN</a></td><td><a href="https://doi.org/10.5334/dsj-2024-025">https://doi.org/10.5334/dsj-2024-025</a></td></tr><tr><td align="center"><p>Classify <strong>calf behaviour</strong> from accelerometer signals.</p><p> 📙 <a href="https://arxiv.org/abs/2404.18159"><em><mark style="color:orange;">arXiv (2024)</mark></em></a><em><mark style="color:orange;">.</mark></em></p></td><td><a href="/files/OOOfMw9F98JNZ3A9fXCX">/files/OOOfMw9F98JNZ3A9fXCX</a></td><td><a href="https://arxiv.org/abs/2404.18159">https://arxiv.org/abs/2404.18159</a></td></tr><tr><td align="center"><p>Detect early signals of <strong>disease outbreaks</strong> from incidence data. 
</p><p>📗<a href="https://journals.plos.org/ploscompbiol/article?id=10.1371/journal.pcbi.1012782"><em><mark style="color:green;">PLOS Computational Biology (2025).</mark></em></a></p><p> 📙<a href="https://arxiv.org/abs/2404.08893"><em><mark style="color:orange;">arXiv</mark></em><mark style="color:orange;"> (2024)</mark></a><mark style="color:orange;">.</mark></p></td><td><a href="/files/QPbHRTh0JtKsxoJuxJaM">/files/QPbHRTh0JtKsxoJuxJaM</a></td><td><a href="https://arxiv.org/abs/2404.08893">https://arxiv.org/abs/2404.08893</a></td></tr><tr><td align="center"><p>Predict the incidence trends of <strong>infectious diseases</strong>.</p><p> 📗 <a href="https://doi.org/10.1186/s12859-023-05621-5"><em><mark style="color:green;">BMC Bioinformatics</mark></em> <em><mark style="color:green;">(2024)</mark></em></a><em><mark style="color:green;">.</mark></em></p></td><td><a href="/files/Zqk5I5JtSb5svL2QLNGh">/files/Zqk5I5JtSb5svL2QLNGh</a></td><td><a href="https://bmcbioinformatics.biomedcentral.com/articles/10.1186/s12859-023-05621-5">https://bmcbioinformatics.biomedcentral.com/articles/10.1186/s12859-023-05621-5</a></td></tr><tr><td align="center"><p>Distinguish <strong>chemical stimuli</strong> from <em>C. elegans</em> chemosensory system recordings. 
</p><p>📙 <a href="https://www.biorxiv.org/content/10.1101/2024.01.14.575365v1"><em><mark style="color:orange;">bioRxiv (2024).</mark></em></a></p></td><td><a href="/files/cjKxw5fwYkGGqtg4PlxT">/files/cjKxw5fwYkGGqtg4PlxT</a></td><td><a href="https://www.biorxiv.org/content/10.1101/2024.01.14.575365v1">https://www.biorxiv.org/content/10.1101/2024.01.14.575365v1</a></td></tr><tr><td align="center"><p>Predict when an individual patient can <strong>switch from IV to oral antibiotic treatment</strong> from routinely collected clinical parameters from over 10,000 ICU stays.</p><p> <a href="https://doi.org/10.1038/s41467-024-44740-2">📗 <em><mark style="color:green;">Nature Comms.</mark></em> <em><mark style="color:green;">(2024)</mark></em></a><em><mark style="color:green;">.</mark></em></p></td><td><a href="/files/FvPjjg21kBT1ZADvxOOX">/files/FvPjjg21kBT1ZADvxOOX</a></td><td><a href="https://www.nature.com/articles/s41467-024-44740-2">https://www.nature.com/articles/s41467-024-44740-2</a></td></tr><tr><td align="center"><p><strong>Track </strong><em><strong>Drosophila</strong></em> in real time for high-throughput behavioural phenotyping.</p><p> <a href="https://elifesciences.org/articles/86695">📗 <em><mark style="color:green;">eLife</mark></em><mark style="color:green;"> (2023)</mark></a><mark style="color:green;">.</mark></p></td><td><a href="/files/hJ8eT5XyH3snWCB6gU4C">/files/hJ8eT5XyH3snWCB6gU4C</a></td><td><a href="https://elifesciences.org/articles/86695">https://elifesciences.org/articles/86695</a></td></tr><tr><td align="center"><p>Predict <strong>cardiomyocyte differentiation outcome</strong> from oxygen consumption rate time series from human-induced <strong>pluripotent stem cells</strong>.</p><p> <span data-gb-custom-inline data-tag="emoji" data-code="1f4d5">📕</span><a href="https://doi.org/10.1002/bit.28489"> <em><mark style="color:red;">Biotechnology and Bioengineering (2023)</mark></em></a><em><mark style="color:red;">.</mark></em></p></td><td><a 
href="/files/rn2oTkorG2qbFAvn8h1B">/files/rn2oTkorG2qbFAvn8h1B</a></td><td><a href="https://analyticalsciencejournals.onlinelibrary.wiley.com/doi/10.1002/bit.28489">https://analyticalsciencejournals.onlinelibrary.wiley.com/doi/10.1002/bit.28489</a></td></tr><tr><td align="center"><p>Detect dynamic electrical signatures of <strong>human breast cancer cells</strong> from voltage imaging.</p><p> <a href="https://doi.org/10.1038/s42003-022-04077-2">📗 <em><mark style="color:green;">Comms. Bio. (2022)</mark></em></a><em><mark style="color:green;">.</mark></em></p></td><td><a href="/files/Z27GwP22DLn2Kd2h8wAG">/files/Z27GwP22DLn2Kd2h8wAG</a></td><td><a href="https://www.nature.com/articles/s42003-022-04077-2">https://www.nature.com/articles/s42003-022-04077-2</a></td></tr><tr><td align="center"><p>Screen for <strong>COVID-19</strong> using holographic microscopy reconstructed <strong>red blood cells</strong>.</p><p> <a href="https://doi.org/10.1364/BOE.466005">📗 <em><mark style="color:green;">Biomedical Optics Express (2022)</mark></em></a><em><mark style="color:green;">.</mark></em></p></td><td><a href="/files/9ijUe9lACdJLZRulknTB">/files/9ijUe9lACdJLZRulknTB</a></td><td><a href="https://opg.optica.org/boe/fulltext.cfm?uri=boe-13-10-5377&#x26;id=502632">https://opg.optica.org/boe/fulltext.cfm?uri=boe-13-10-5377&#x26;id=502632</a></td></tr><tr><td align="center"><p>Evaluate similarity of synthetically generated <strong>peripheral nerve signals</strong>.</p><p> <span data-gb-custom-inline data-tag="emoji" data-code="1f4d5">📕</span> <a href="https://ieeexplore.ieee.org/abstract/document/9441284"><em><mark style="color:red;">10th International IEEE/EMBS Conference on Neural Engineering (NER) (2021)</mark></em></a><em><mark style="color:red;">.</mark></em></p></td><td><a href="/files/6FqF5mznZ74kmVDfKc5l">/files/6FqF5mznZ74kmVDfKc5l</a></td><td><a 
href="https://ieeexplore.ieee.org/abstract/document/9441284">https://ieeexplore.ieee.org/abstract/document/9441284</a></td></tr><tr><td align="center"><p>Understand influences on the dynamics of <strong>tree motion</strong>.</p><p> <a href="https://doi.org/10.5194/bg-18-4059-2021">📗 <em><mark style="color:green;">Biogeosciences (2021)</mark></em></a><em><mark style="color:green;">.</mark></em></p></td><td><a href="/files/0UrcaAO7hZnanweRKEvd">/files/0UrcaAO7hZnanweRKEvd</a></td><td><a href="https://bg.copernicus.org/articles/18/4059/2021/">https://bg.copernicus.org/articles/18/4059/2021/</a></td></tr><tr><td align="center"><br></td><td></td><td></td></tr></tbody></table>

* To quantify neural coding from MEG time series. [📙 <mark style="color:orange;">Maleki & Karimi-Rouzbahani.</mark> <mark style="color:orange;"></mark>*<mark style="color:orange;">bioRxiv</mark>* <mark style="color:orange;"></mark><mark style="color:orange;">(2025)</mark>](https://www.biorxiv.org/content/10.1101/2025.04.30.651376)<mark style="color:orange;">.</mark>
* To classify peptides from blockage current time series measured from a nanopore-based device. :closed\_book:[<mark style="color:red;">Hoßbach et al.,</mark> <mark style="color:red;"></mark>*<mark style="color:red;">The Journal of Chemical Physics</mark>* <mark style="color:red;"></mark><mark style="color:red;">(2025).</mark>](https://doi.org/10.1063/5.0250399) :orange\_book: [*<mark style="color:orange;">arXiv (2024)</mark>*](https://arxiv.org/abs/2408.14275)*<mark style="color:orange;">.</mark>*
* As a baseline for classifying marmoset monkey calls from audio. :green\_book: [<mark style="color:green;">Interspeech 2024 satellite event (2024)</mark>](https://vihar-2024.vihar.org/assets/VIHAR_2024_proceedings.pdf)<mark style="color:green;">.</mark> 📙 [<mark style="color:orange;">Sarkar et al.</mark> <mark style="color:orange;"></mark>*<mark style="color:orange;">arXiv</mark>* <mark style="color:orange;"></mark><mark style="color:orange;">(2024).</mark>](https://arxiv.org/abs/2407.16417)
* To identify methamphetamine users from EEG recordings. 📗 [<mark style="color:green;">Meynaghizadeh-Zargar et al.,</mark> <mark style="color:green;"></mark>*<mark style="color:green;">Biomedical Research Bulletin</mark>* <mark style="color:green;"></mark><mark style="color:green;">(2024)</mark>](https://biomedrb.com/Article/brb-4041)<mark style="color:green;">.</mark>

***

### 🏭 Industry, energy and chemistry

<table data-view="cards" data-full-width="false"><thead><tr><th align="center"></th><th data-hidden data-card-cover data-type="files"></th><th data-hidden data-card-target data-type="content-ref"></th></tr></thead><tbody><tr><td align="center"><p>Detect <strong>edge flag faults</strong> in <strong>wind turbines</strong>.</p><p> 📗 <a href="https://wes.copernicus.org/articles/9/181/2024/"><em><mark style="color:green;">Wind Energy Sci. Discuss.</mark></em> <em><mark style="color:green;">(2023)</mark></em></a><em><mark style="color:green;">.</mark></em></p></td><td><a href="/files/GCPuAcr47uj7AVQqpTss">/files/GCPuAcr47uj7AVQqpTss</a></td><td><a href="https://wes.copernicus.org/articles/9/181/2024/">https://wes.copernicus.org/articles/9/181/2024/</a></td></tr><tr><td align="center"><p>Compare the influence of imputation strategies for classifying <strong>household devices from electricity usage</strong>.</p><p> <span data-gb-custom-inline data-tag="emoji" data-code="1f4d5">📕</span><a href="https://ieeexplore.ieee.org/document/10375473"><em><mark style="color:red;">2023 Tenth International Conference on Social Networks Analysis, Management and Security (SNAMS).</mark></em></a></p></td><td><a href="/files/04sSIFyL8eWGVakw6BoI">/files/04sSIFyL8eWGVakw6BoI</a></td><td><a href="https://doi.org/10.1109/SNAMS60348.2023.10375473">https://doi.org/10.1109/SNAMS60348.2023.10375473</a></td></tr><tr><td align="center"><p>Detect <strong>chemical analytes</strong> from chemiresistive hardware sensor arrays.</p><p> 📙 <a href="https://arxiv.org/abs/2312.09871"><em><mark style="color:orange;">arXiv (2023).</mark></em></a></p></td><td><a href="/files/m9PGt6AMmmL3KfICXSev">/files/m9PGt6AMmmL3KfICXSev</a></td><td><a href="https://arxiv.org/abs/2312.09871">https://arxiv.org/abs/2312.09871</a></td></tr><tr><td align="center"><p>Detect <strong>fraud</strong> from smart meters. 
</p><p>📗 <a href="https://www.researchgate.net/publication/376641713_Predictive_Fraud_Detection_An_Intelligent_Method_for_Internet_of_Smart_Grid_Things_Systems"><em><mark style="color:green;">Journal of Internet Services and Applications (2023).</mark></em></a></p></td><td><a href="/files/POOLRqaOYer4L84gPk3r">/files/POOLRqaOYer4L84gPk3r</a></td><td><a href="https://www.researchgate.net/publication/376641713_Predictive_Fraud_Detection_An_Intelligent_Method_for_Internet_of_Smart_Grid_Things_Systems">https://www.researchgate.net/publication/376641713_Predictive_Fraud_Detection_An_Intelligent_Method_for_Internet_of_Smart_Grid_Things_Systems</a></td></tr><tr><td align="center"><p><strong>Appliance detection</strong> from very low-frequency smart meter time series.</p><p><span data-gb-custom-inline data-tag="emoji" data-code="1f4d5">📕</span><a href="https://dl.acm.org/doi/abs/10.1145/3575813.3595198#core-cited-by"><mark style="color:red;">ACM International Conference on Future Energy Systems (2023).</mark></a></p><p><a href="https://www.researchgate.net/publication/370654257_Appliance_Detection_Using_Very_Low-Frequency_Smart_Meter_Time_Series">📙 </a><a href="https://www.researchgate.net/publication/370654257_Appliance_Detection_Using_Very_Low-Frequency_Smart_Meter_Time_Series"><em><mark style="color:orange;">ResearchGate (2023).</mark></em></a></p></td><td><a href="/files/VFALu7ZIgufa76LcekRH">/files/VFALu7ZIgufa76LcekRH</a></td><td><a href="https://www.researchgate.net/profile/Adrien-Petralia-2/publication/370654257_Appliance_Detection_Using_Very_Low-Frequency_Smart_Meter_Time_Series/links/645c0b48f43b8a29ba40dd25/Appliance-Detection-Using-Very-Low-Frequency-Smart-Meter-Time-Series.pdf">https://www.researchgate.net/profile/Adrien-Petralia-2/publication/370654257_Appliance_Detection_Using_Very_Low-Frequency_Smart_Meter_Time_Series/links/645c0b48f43b8a29ba40dd25/Appliance-Detection-Using-Very-Low-Frequency-Smart-Meter-Time-Series.pdf</a></td></tr><tr><td 
align="center"><p>Track <strong>on-board diagnostics</strong> for monitoring engine operations.</p><p>📗 <a href="https://www.scitepress.org/Link.aspx?doi=10.5220/0011036400003191"><em><mark style="color:green;">Proceedings of the 8th International Conference on Vehicle Technology and Intelligent Transport Systems - VEHITS</mark></em> <em><mark style="color:green;">(2022).</mark></em></a></p></td><td><a href="/files/qvc8mDBsnffKr2kXbqtM">/files/qvc8mDBsnffKr2kXbqtM</a></td><td><a href="https://doi.org/10.5220/0011036400003191">https://doi.org/10.5220/0011036400003191</a></td></tr><tr><td align="center"><p>Detect <strong>anomalies in cloud services</strong>, where using <em>catch22</em> resulted in the highest performance. </p><p>📗 <a href="https://www.mdpi.com/2624-831X/3/1/8"><em><mark style="color:green;">IoT</mark></em><mark style="color:green;"> (2022).</mark></a></p></td><td><a href="/files/oQekLFM805gvDw2ixusP">/files/oQekLFM805gvDw2ixusP</a></td><td><a href="https://www.mdpi.com/2624-831X/3/1/8">https://www.mdpi.com/2624-831X/3/1/8</a></td></tr><tr><td align="center"><p>Predict <strong>anode effects in aluminium production</strong> at least 1 min in advance from TRIMET Aluminium SE Essen (TAE) time-series data.</p><p>📗 <a href="https://www.mdpi.com/2076-3417/10/24/9050"><em><mark style="color:green;">Applied Sciences</mark></em><mark style="color:green;"> (2021).</mark></a></p></td><td><a href="/files/vslIOSM9ZyqWCONgSBpC">/files/vslIOSM9ZyqWCONgSBpC</a></td><td><a href="https://www.mdpi.com/2076-3417/10/24/9050">https://www.mdpi.com/2076-3417/10/24/9050</a></td></tr><tr><td align="center">Classify vehicle trajectories obtained from unmanned aerial vehicles. <mark style="color:green;">📗</mark><a href="https://www.mdpi.com/2220-9964/13/8/264"><mark style="color:green;">Teo et al. </mark><em><mark style="color:green;">ISPRS Int. J. 
Geo-Inf.</mark></em><mark style="color:green;"> (2024).</mark></a></td><td><a href="/files/0Wm0nsvzxOpaCQLk1mUH">/files/0Wm0nsvzxOpaCQLk1mUH</a></td><td><a href="https://www.mdpi.com/2220-9964/13/8/264">https://www.mdpi.com/2220-9964/13/8/264</a></td></tr></tbody></table>

**Others**:

* Audio classification. [<mark style="color:orange;">📙 Marzano et al. (2025)</mark> <mark style="color:orange;"></mark>*<mark style="color:orange;">arXiv</mark>*](https://arxiv.org/abs/2503.17018).
* Hard disk drive failure prediction. [<mark style="color:green;">📗 Li et al. (2025)</mark> <mark style="color:green;"></mark>*<mark style="color:green;">Engineering Applications of Artificial Intelligence</mark>*<mark style="color:green;">.</mark>](https://doi.org/10.1016/j.engappai.2025.110674)
* Predicting different types of induction motor failure. :closed\_book: [*<mark style="color:red;">Li et al., IEEE Annual Symposium on Reliability and Maintainability (2025).</mark>*](https://ieeexplore.ieee.org/document/10935284)
* Analyzing appliance consumption by automatically detecting and classifying appliance activations in industrial kitchens. 📗 [*<mark style="color:green;">Martins et al. Computers and Electrical Engineering</mark>* <mark style="color:green;"></mark><mark style="color:green;">(2025).</mark>](https://doi.org/10.1016/j.compeleceng.2025.110163)
* Sound analysis of drop characteristics by evaluating their impact on a water pool. :closed\_book: [*<mark style="color:red;">Arogeti et al., IEEE 2024 Signal Processing: Algorithms, Architectures, Arrangements, and Applications (2024)</mark>*](https://doi.org/10.23919/SPA61993.2024.10715636)*<mark style="color:red;">.</mark>*

***

### 🧠 Neuroimaging

<table data-view="cards" data-full-width="false"><thead><tr><th align="center"></th><th data-hidden data-card-cover data-type="files"></th><th data-hidden data-card-target data-type="content-ref"></th></tr></thead><tbody><tr><td align="center"><p><strong>EEG classification</strong> using AutoML.</p><p>📙 <a href="https://repositorio.usp.br/directbitstream/0a2656f3-54e2-449d-ba1a-1db95ca007d3/3142491.pdf"><em><mark style="color:orange;">Anais do Symposium on Knowledge Discovery, Mining and Learning (KDMiLe) (2023).</mark></em></a></p></td><td><a href="/files/pZdyb97cg0MLpQy3hQGa">/files/pZdyb97cg0MLpQy3hQGa</a></td><td><a href="https://repositorio.usp.br/directbitstream/0a2656f3-54e2-449d-ba1a-1db95ca007d3/3142491.pdf">https://repositorio.usp.br/directbitstream/0a2656f3-54e2-449d-ba1a-1db95ca007d3/3142491.pdf</a></td></tr><tr><td align="center"><p>Infer rules underlying experience of <strong>subjective liking of artwork</strong> stimuli from EEG.</p><p>📗 <a href="https://journals.plos.org/plosone/article?id=10.1371/journal.pone.0287513"><em><mark style="color:green;">PLOS ONE (2023)</mark></em>. </a></p></td><td><a href="/files/xyh8EsldJR5vfw1w5CdI">/files/xyh8EsldJR5vfw1w5CdI</a></td><td><a href="https://doi.org/10.1371/journal.pone.0287513">https://doi.org/10.1371/journal.pone.0287513</a></td></tr></tbody></table>

* Classify Multiple Sclerosis from full-field visual evoked potentials. <mark style="color:green;">📗</mark> [<mark style="color:green;">Banijamali et al.,</mark> <mark style="color:green;"></mark>*<mark style="color:green;">Doc Ophthalmol</mark>* <mark style="color:green;"></mark><mark style="color:green;">(2024).</mark>](https://link.springer.com/article/10.1007/s10633-024-09980-z)
* Investigate the neural correlates of dreaming. <mark style="color:orange;">📙</mark> [<mark style="color:orange;">Wong et al. PsyArXiv (2025).</mark>](https://osf.io/preprints/psyarxiv/69e43_v1)

***

### 🔭 Meteorology and astronomy

<table data-view="cards" data-full-width="false"><thead><tr><th align="center"></th><th data-hidden data-card-cover data-type="files"></th><th data-hidden data-card-target data-type="content-ref"></th></tr></thead><tbody><tr><td align="center"><p>Generate long-term <strong>air temperature forecasts</strong> as part of explainable AI models. </p><p>📗 <a href="https://www.sciencedirect.com/science/article/pii/S0169809523000054?via%3Dihub"><em><mark style="color:green;">Atmospheric Research (2023).</mark></em></a></p></td><td><a href="/files/YvklpxnfzLavjRWBtXkC">/files/YvklpxnfzLavjRWBtXkC</a></td><td><a href="https://doi.org/10.1016/j.atmosres.2023.106608">https://doi.org/10.1016/j.atmosres.2023.106608</a></td></tr><tr><td align="center"><p>Detect <strong>earthquakes</strong>.</p><p>📙 <a href="https://arxiv.org/abs/2205.00525"><em><mark style="color:orange;">arXiv (2022).</mark></em></a></p></td><td><a href="/files/MdBruiJq1HLiCgHiwxq0">/files/MdBruiJq1HLiCgHiwxq0</a></td><td><a href="https://doi.org/10.48550/arXiv.2205.00525">https://doi.org/10.48550/arXiv.2205.00525</a></td></tr><tr><td align="center"><p><strong>Classify exoplanets</strong> from light curves.</p><p>📙 <a href="https://uu.diva-portal.org/smash/record.jsf?pid=diva2%3A1325376&#x26;dswid=5668"><em><mark style="color:orange;">Student dissertation (2019).</mark></em></a></p></td><td><a href="/files/Es2GOAvy5HSen2OcMDHw">/files/Es2GOAvy5HSen2OcMDHw</a></td><td><a href="http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-385690">http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-385690</a></td></tr></tbody></table>

* Predict solar flares from solar active region magnetic field data. [📙 *<mark style="color:orange;">arXiv (2024)</mark>*](https://arxiv.org/abs/2410.00312)*<mark style="color:orange;">.</mark>*

***

### 🏛️ Finance and databases

<table data-view="cards" data-full-width="false"><thead><tr><th align="center"></th><th data-hidden data-card-cover data-type="files"></th><th data-hidden data-card-target data-type="content-ref"></th></tr></thead><tbody><tr><td align="center"><p>As part of a <strong>meta-learning</strong> strategy for predicting <strong>market price</strong> movement.</p><p>📙 <a href="https://repositorio.usp.br/directbitstream/bd56a114-4b4c-49b7-a179-554e1d1fc099/3142482.pdf"><em><mark style="color:orange;">Anais do Symposium on Knowledge Discovery, Mining and Learning (KDMiLe) (2023)</mark></em></a>.</p></td><td><a href="/files/fckVDudjMzvgnJB4bYsf">/files/fckVDudjMzvgnJB4bYsf</a></td><td><a href="https://repositorio.usp.br/directbitstream/bd56a114-4b4c-49b7-a179-554e1d1fc099/3142482.pdf">https://repositorio.usp.br/directbitstream/bd56a114-4b4c-49b7-a179-554e1d1fc099/3142482.pdf</a></td></tr><tr><td align="center"><p>Representations for time series in <strong>database management systems</strong>, such as Apache IoTDB, InfluxDB, OpenTSDB. </p><p>📙 <a href="https://link.springer.com/article/10.1007/s00778-024-00840-5"><em><mark style="color:red;">The VLDB Journal (2024)</mark></em>.</a></p></td><td><a href="/files/BT83LiVVjkB3UhnFgD7Y">/files/BT83LiVVjkB3UhnFgD7Y</a></td><td><a href="https://link.springer.com/article/10.1007/s00778-024-00840-5">https://link.springer.com/article/10.1007/s00778-024-00840-5</a></td></tr><tr><td align="center"><p>Analyze US <strong>market price</strong> data. 
</p><p>📙 <a href="https://arxiv.org/abs/2303.16117"><em><mark style="color:orange;">arXiv</mark></em><mark style="color:orange;"> (2023)</mark>.</a></p></td><td><a href="/files/7ki4NvnxIJbeDoz3j8Gt">/files/7ki4NvnxIJbeDoz3j8Gt</a></td><td><a href="https://arxiv.org/abs/2303.16117">https://arxiv.org/abs/2303.16117</a></td></tr><tr><td align="center"><p>Capture meaningful properties of <strong>financial time series</strong> (performance is lower than using domain features).</p><p>📙 <a href="https://econpapers.repec.org/paper/iseremwps/wp01852021.htm"><em><mark style="color:orange;">Working Papers REM</mark></em> <em><mark style="color:orange;">(2021).</mark></em></a></p></td><td><a href="/files/hhcYJjsaPkHVe0769i8n">/files/hhcYJjsaPkHVe0769i8n</a></td><td><a href="https://econpapers.repec.org/paper/iseremwps/wp01852021.htm">https://econpapers.repec.org/paper/iseremwps/wp01852021.htm</a></td></tr><tr><td align="center"><br>Distinguish <strong>texts generated by AI (LLMs)</strong> from text authored by humans.<br> 📙 <em><mark style="color:orange;">Proceedings of the 19th Workshop on Innovative Use of NLP for Building Educational Applications (BEA 2024).</mark></em></td><td><a href="/files/wxXL23sDnlU8vrDia8zx">/files/wxXL23sDnlU8vrDia8zx</a></td><td></td></tr></tbody></table>

***

### 🌍 General

<table data-view="cards" data-full-width="false"><thead><tr><th align="center"></th><th data-hidden data-card-cover data-type="files"></th><th data-hidden data-card-target data-type="content-ref"></th></tr></thead><tbody><tr><td align="center"><p>To improve the efficiency and accuracy of <strong>time-series forecasting</strong> on target datasets using transfer learning.</p><p>📙 <a href="https://arxiv.org/abs/2404.06198"><em><mark style="color:orange;">arXiv</mark></em><mark style="color:orange;"> (2024)</mark></a>.</p></td><td><a href="/files/HdT8HisrUMh8PpzhDAAi">/files/HdT8HisrUMh8PpzhDAAi</a></td><td><a href="https://arxiv.org/abs/2404.06198">https://arxiv.org/abs/2404.06198</a></td></tr><tr><td align="center"><p>Compare and <strong>cluster</strong> long time series.</p><p>📗 <a href="https://ieeexplore.ieee.org/document/10030022/"><em><mark style="color:green;">IEEE International Conference on Knowledge Graph (ICKG)</mark></em><mark style="color:green;"> (2022)</mark>.</a></p></td><td><a href="/files/41JZjt1sFIqsmCSa068V">/files/41JZjt1sFIqsmCSa068V</a></td><td><a href="https://doi.org/10.1109/ICKG55886.2022.00013">https://doi.org/10.1109/ICKG55886.2022.00013</a></td></tr><tr><td align="center"><p>To understand dataset differences in evaluating foundation models for probabilistic <strong>time-series forecasting</strong>.</p><p><a href="https://arxiv.org/abs/2310.08278">📙 <em><mark style="color:orange;">arXiv</mark></em><mark style="color:orange;"> (2023)</mark></a><mark style="color:orange;">.</mark></p></td><td><a href="/files/sFDQ1jGR1QlxlL7WqGJx">/files/sFDQ1jGR1QlxlL7WqGJx</a></td><td><a href="https://arxiv.org/abs/2310.08278">https://arxiv.org/abs/2310.08278</a></td></tr><tr><td align="center"><p>Predict performance of <strong>time-series forecasting</strong> algorithms.</p><p> <a href="https://doi.org/10.1016/j.eswa.2022.119023">📗 <em><mark style="color:green;">Expert Systems with Applications</mark></em><mark style="color:green;"> 
(2023).</mark></a></p></td><td><a href="/files/3dGcdNsCk4TuM7ekGbbd">/files/3dGcdNsCk4TuM7ekGbbd</a></td><td><a href="https://linkinghub.elsevier.com/retrieve/pii/S0957417422020413">https://linkinghub.elsevier.com/retrieve/pii/S0957417422020413</a></td></tr><tr><td align="center"><p>Track optimization trajectories for <strong>combinatorial optimization</strong> problems.</p><p><a href="https://doi.org/10.48550/arXiv.2211.15368">📙 <em><mark style="color:orange;">arXiv</mark></em><mark style="color:orange;"> (2022)</mark></a><mark style="color:orange;">.</mark></p></td><td><a href="/files/zujPyT2xsjEvhSWrPl5W">/files/zujPyT2xsjEvhSWrPl5W</a></td><td><a href="https://arxiv.org/abs/2211.15368">https://arxiv.org/abs/2211.15368</a></td></tr><tr><td align="center"><p>Extend <strong>shapelets</strong> for time-series classification.</p><p> <a href="https://www.mdpi.com/2076-3417/12/17/8685">📗 <em><mark style="color:green;">Applied Sciences</mark></em><mark style="color:green;"> (2022)</mark></a><mark style="color:green;">.</mark></p></td><td><a href="/files/wNX5QCUYnGaxg9Tw7EjG">/files/wNX5QCUYnGaxg9Tw7EjG</a></td><td><a href="https://www.mdpi.com/2076-3417/12/17/8685">https://www.mdpi.com/2076-3417/12/17/8685</a></td></tr><tr><td align="center"><p>Perform <strong>meta-learning</strong> for <strong>time-series forecasting</strong>.</p><p> <a href="https://doi.org/10.1109/ACCESS.2021.3074891">📗 <em><mark style="color:green;">IEEE Access</mark></em><mark style="color:green;"> (2021)</mark></a><mark style="color:green;">.</mark></p></td><td><a href="/files/O62ZEK2ofIdpA9WCPjPP">/files/O62ZEK2ofIdpA9WCPjPP</a></td><td><a href="https://ieeexplore.ieee.org/document/9410467">https://ieeexplore.ieee.org/document/9410467</a></td></tr><tr><td align="center">Evaluate time-series imputation methods.<br>📗<a 
href="https://www.researchgate.net/profile/Mourad-Khayati/publication/383497799_ImputeVIS_An_Interactive_Evaluator_to_Benchmark_Imputation_Techniques_for_Time_Series_Data/links/66d039dbfa5e11512c3cea73/ImputeVIS-An-Interactive-Evaluator-to-Benchmark-Imputation-Techniques-for-Time-Series-Data.pdf"><em><mark style="color:green;">Proc. VLDB Endowment</mark></em><mark style="color:green;"> (2024).</mark></a><br></td><td><a href="/files/WjjJyIb7bMxeFnCSZTsB">/files/WjjJyIb7bMxeFnCSZTsB</a></td><td><a href="https://www.researchgate.net/publication/383497799_ImputeVIS_An_Interactive_Evaluator_to_Benchmark_Imputation_Techniques_for_Time_Series_Data">https://www.researchgate.net/publication/383497799_ImputeVIS_An_Interactive_Evaluator_to_Benchmark_Imputation_Techniques_for_Time_Series_Data</a></td></tr></tbody></table>

***

## Tutorials

<table data-view="cards" data-full-width="false"><thead><tr><th></th><th></th><th></th><th data-hidden data-card-cover data-type="files"></th><th data-hidden data-card-target data-type="content-ref"></th></tr></thead><tbody><tr><td></td><td>Using <em>catch22</em> features in <a href="https://www.sktime.net/en/stable/examples/transformation/catch22.html">sktime</a>.</td><td></td><td><a href="/files/K7R92zDtgtTXnGcR8Zhn">/files/K7R92zDtgtTXnGcR8Zhn</a></td><td><a href="https://www.sktime.net/en/stable/examples/transformation/catch22.html">https://www.sktime.net/en/stable/examples/transformation/catch22.html</a></td></tr></tbody></table>

***

# Agent Instructions: Querying This Documentation

If you need additional information that is not directly available in this page, you can query the documentation dynamically by asking a question.

Perform an HTTP GET request on the current page URL with the `ask` query parameter:

```
GET https://time-series-features.gitbook.io/catch22/welcome-to-catch22/publications-using-catch22.md?ask=<question>
```

The question should be specific, self-contained, and written in natural language (URL-encode it when constructing the request).
The response will contain a direct answer to the question and relevant excerpts and sources from the documentation.

Use this mechanism when the answer is not explicitly present on the current page, when you need clarification or additional context, or when you want to retrieve related documentation sections.
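The request above can be sketched in Python using only the standard library. This assumes the `?ask=` endpoint behaves as described here; the example only builds the properly encoded URL, with the actual fetch shown in comments since it requires network access.

```python
# Sketch: building a query URL for the docs "ask" endpoint described above.
# Assumes the GitBook `?ask=` parameter works as documented on this page.
from urllib.parse import urlencode

BASE = ("https://time-series-features.gitbook.io/catch22/"
        "welcome-to-catch22/publications-using-catch22.md")

def ask_url(question: str) -> str:
    """Return the GET URL for a natural-language question, URL-encoded."""
    return f"{BASE}?{urlencode({'ask': question})}"

url = ask_url("Which publications use catch22 for forecasting?")
print(url)

# To actually perform the request (network required):
# import urllib.request
# answer = urllib.request.urlopen(url).read().decode()
```

`urlencode` handles spaces and punctuation in the question, so the resulting URL is always well-formed regardless of how the question is phrased.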
