Volume 3, Issue 1
  • ISSN: 2950-3779
  • E-ISSN: 2950-3787

Abstract

In recent years, neural networks have become increasingly common across many domains because of their ability to learn intricate patterns and produce accurate predictions. Nonetheless, building an efficient neural network model is a difficult task that requires careful consideration of multiple factors, such as the architecture, the optimization method, and the regularization technique. This paper provides a comprehensive overview of state-of-the-art artificial neural network (ANN) generation and highlights key challenges and opportunities in machine learning applications. It offers a critical analysis of current neural network design methodologies, focusing on the strengths and weaknesses of different approaches, and explores the use of deep neural networks (DNNs) in image recognition, natural language processing, and time series analysis. In addition, it examines the benefits of selecting optimal values for the main components of an ANN, including the number of input/output layers, the number of hidden layers, the activation function, the number of training epochs, and the choice of model type; setting these components to suitable values can improve the model's overall performance and generalization. Furthermore, it identifies common pitfalls and limitations of existing design methodologies, such as overfitting, lack of interpretability, and computational complexity. Finally, it proposes directions for future research, including more efficient and interpretable neural network architectures, more scalable training algorithms, and the potential of new paradigms such as spiking neural networks, quantum neural networks, and neuromorphic computing.
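To make the tuning space described above concrete, the short sketch below (an illustrative assumption, not code from the paper) shows how the design choices listed in the abstract, such as the number and width of hidden layers, the activation function, the number of training epochs, and regularization strength, map onto configurable hyperparameters of scikit-learn's MLPClassifier. The synthetic dataset and the specific values are placeholders for demonstration only.

```python
# Minimal sketch: ANN design choices expressed as hyperparameters.
# Dataset and parameter values are assumptions for illustration.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

# Synthetic tabular data standing in for any supervised learning task.
X, y = make_classification(n_samples=2000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0
)

model = MLPClassifier(
    hidden_layer_sizes=(64, 32),  # number and size of hidden layers
    activation="relu",            # activation function choice
    alpha=1e-4,                   # L2 regularization to limit overfitting
    max_iter=300,                 # upper bound on training epochs
    early_stopping=True,          # stop when validation score stops improving
    random_state=0,
)
model.fit(X_train, y_train)
print(f"Held-out accuracy: {model.score(X_test, y_test):.3f}")
```

Varying these hyperparameters (for example, a grid or random search over hidden_layer_sizes and alpha) is one simple way to explore the performance/generalization trade-off the paper discusses.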
