Please use this identifier to cite or link to this item:
Type: Article
Title: Data-driven modeling of smartphone-based electrochemiluminescence sensor data using artificial intelligence
Author: Rivera, Elmer Ccopa
Swerdlow, Jonathan J.
Summerscales, Rodney L.
Uppala, Padma P. Tadi
Maciel Filho, Rubens
Neto, Mabio R. C.
Kwon, Hyun J.
Abstract: Understanding relationships among multimodal data extracted from a smartphone-based electrochemiluminescence (ECL) sensor is crucial for the development of low-cost point-of-care diagnostic devices. In this work, artificial intelligence (AI) algorithms such as random forest (RF) and feedforward neural network (FNN) are used to quantitatively investigate the relationships between the concentration of Ru(bpy)₃²⁺ luminophore and its experimentally measured ECL and electrochemical data. A smartphone-based ECL sensor with Ru(bpy)₃²⁺/TPrA was developed using disposable screen-printed carbon electrodes. ECL images and amperograms were simultaneously obtained following 1.2-V voltage application. These multimodal data were analyzed by RF and FNN algorithms, which allowed the prediction of Ru(bpy)₃²⁺ concentration using multiple key features. High correlation (0.99 and 0.96 for RF and FNN, respectively) between actual and predicted values was achieved in the detection range between 0.02 µM and 2.5 µM. The AI approaches using RF and FNN were capable of directly inferring the concentration of Ru(bpy)₃²⁺ using easily observable key features. The results demonstrate that data-driven AI algorithms are effective in analyzing the multimodal ECL sensor data. Therefore, these AI algorithms can be an essential part of the modeling arsenal, with successful application in ECL sensor data modeling.
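The abstract describes regressing luminophore concentration from multimodal sensor features with a feedforward neural network. A minimal sketch of that idea, using synthetic data in place of the paper's measurements: the two features (ECL image intensity and amperometric current), the network size, and the noise model are all illustrative assumptions, not taken from the article.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "multimodal" features: both scale (noisily) with
# concentration over the paper's 0.02-2.5 uM detection range.
conc = rng.uniform(0.02, 2.5, size=(200, 1))
X = np.hstack([conc + 0.05 * rng.normal(size=conc.shape),
               0.8 * conc + 0.05 * rng.normal(size=conc.shape)])
y = conc

# One hidden tanh layer, linear output, trained by gradient descent on MSE.
W1 = rng.normal(scale=0.5, size=(2, 8)); b1 = np.zeros(8)
W2 = rng.normal(scale=0.5, size=(8, 1)); b2 = np.zeros(1)
lr = 0.05
for _ in range(3000):
    h = np.tanh(X @ W1 + b1)
    pred = h @ W2 + b2
    err = pred - y
    # Backpropagate the mean-squared-error gradient.
    gW2 = h.T @ err / len(X); gb2 = err.mean(0)
    dh = (err @ W2.T) * (1 - h ** 2)
    gW1 = X.T @ dh / len(X); gb1 = dh.mean(0)
    W1 -= lr * gW1; b1 -= lr * gb1
    W2 -= lr * gW2; b2 -= lr * gb2

# Correlation between actual and predicted concentration, the metric
# the abstract reports (0.96 for the FNN on the real data).
pred = np.tanh(X @ W1 + b1) @ W2 + b2
r = np.corrcoef(y.ravel(), pred.ravel())[0, 1]
```

On this toy data the correlation is high because the features are nearly linear in concentration; the article's point is that the same feature-to-concentration mapping can be learned from real, noisier multimodal measurements.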
Subject: Cell phone
Country: Switzerland
Publisher: MDPI
Rights: Open access
Identifier DOI: 10.3390/s20030625
Issue Date: 2020
Appears in Collections: FEQ - Artigos e Outros Documentos

Files in This Item:
File: 000517786200049.pdf | Size: 4.09 MB | Format: Adobe PDF

Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.