
Search results

  1. Oct 20, 2023 · This study presents a computationally efficient machine learning (ML) model for diagnosing and monitoring Parkinson’s disease (PD) using rest-state EEG signals (rs-EEG) from 20 PD subjects and 20 normal control (NC) subjects, sampled at 128 Hz. In a comparative analysis of the effectiveness of entropy calculation methods, fuzzy entropy showed the best results ...

  2. Cross-entropy, also known as logarithmic loss or log loss, is a popular loss function used in machine learning to measure the performance of a classification model. It quantifies the difference between the probability distribution predicted by the model and the true distribution of the labels. Binary cross-entropy is used when performing ... (see the first sketch after this list).

  3. Sep 21, 2020 · Here, we find that a machine learning algorithm that is trained to infer the direction of time’s arrow identifies entropy production as the relevant physical quantity in its decision-making ...

  4. Dec 10, 2020 · Information Gain and Mutual Information for Machine Learning. By Jason Brownlee on December 10, 2020, in Probability. Information gain calculates the reduction in entropy or surprise from transforming a dataset in some way. It is commonly used in the construction of decision trees from a training dataset, by evaluating the information gain ... (see the split example after this list).

  5. Aug 30, 2020 · As expected, the cross-entropy is greater than the entropy. Applications in machine learning: I’ve laid out some neat facts about entropy, but it’s not yet clear why we should care about them. To see why entropy is useful, suppose we are building a logistic regression model to classify points into two possible classes, 0 and 1. (A numeric check of the entropy/cross-entropy inequality follows this list.)

  6. Apr 12, 2021 · In the present day, its core fundamentals are applied in the fields of lossless data compression, lossy data compression, and channel coding. The techniques used in Information Theory are probabilistic in nature and usually deal with two specific quantities, viz. entropy and mutual information. (Both are defined after this list.)

  7. Jan 1, 2023 · (b) Annual publications retrieved with the keywords “machine learning material” and “machine learning high-entropy alloys” (inset plot) from ScienceDirect (blue) within the subject area of materials science, Nature Portfolio journals (green), and Physical Review journals (red). Search date: July 26, 2022. (c) Timetable of a few ...
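
The sketches below expand on the more technical snippets above. First, for result 2: a minimal, self-contained computation of binary cross-entropy (log loss). The labels and predicted probabilities are made up purely for illustration.

```python
import math

def binary_cross_entropy(y_true, y_pred, eps=1e-12):
    """Mean binary cross-entropy (log loss) over a batch.

    y_true holds labels in {0, 1}; y_pred holds predicted
    probabilities for class 1.
    """
    total = 0.0
    for y, p in zip(y_true, y_pred):
        p = min(max(p, eps), 1.0 - eps)  # clip so log() never sees 0
        total += -(y * math.log(p) + (1 - y) * math.log(1 - p))
    return total / len(y_true)

# Hypothetical labels and model outputs, for illustration only.
labels = [1, 0, 1, 1]
probs = [0.9, 0.2, 0.7, 0.4]
print(binary_cross_entropy(labels, probs))  # ~0.400; lower is better
```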
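For result 4: information gain as used when growing a decision tree, i.e. the parent node’s entropy minus the weighted entropy of the child nodes after a split. The ten-sample split below is hypothetical.

```python
from collections import Counter
from math import log2

def entropy(labels):
    """Shannon entropy (in bits) of a list of class labels."""
    n = len(labels)
    return -sum((c / n) * log2(c / n) for c in Counter(labels).values())

def information_gain(parent, children):
    """Reduction in entropy from splitting parent into children."""
    n = len(parent)
    weighted = sum(len(ch) / n * entropy(ch) for ch in children)
    return entropy(parent) - weighted

# Hypothetical: 10 samples split into two groups by some feature test.
parent = [0, 0, 0, 0, 0, 1, 1, 1, 1, 1]
left, right = [0, 0, 0, 0, 1], [0, 1, 1, 1, 1]
print(information_gain(parent, [left, right]))  # ~0.278 bits
```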
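For result 5: a numeric check of the claim that the cross-entropy H(p, q) is never smaller than the entropy H(p), with equality exactly when q = p (Gibbs’ inequality). The distributions p and q are arbitrary examples.

```python
from math import log2

def H(p):
    """Shannon entropy of a discrete distribution, in bits."""
    return -sum(pi * log2(pi) for pi in p if pi > 0)

def cross_entropy(p, q):
    """H(p, q) = -sum_i p_i * log2(q_i), in bits."""
    return -sum(pi * log2(qi) for pi, qi in zip(p, q) if pi > 0)

p = [0.5, 0.3, 0.2]  # "true" distribution (arbitrary example)
q = [0.4, 0.4, 0.2]  # model's distribution (arbitrary example)
print(H(p))                 # ~1.485 bits
print(cross_entropy(p, q))  # ~1.522 bits, >= H(p) as expected
print(cross_entropy(p, p))  # equals H(p): the bound is tight at q = p
```

This is also why minimizing cross-entropy in a classifier, such as the logistic regression mentioned in result 5, drives the predicted distribution toward the true label distribution.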
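Finally, for result 6, the standard definitions of the two quantities, stated here in bits:

```latex
% Shannon entropy of a discrete random variable X
H(X) = -\sum_{x} p(x) \log_2 p(x)

% Mutual information between X and Y: the reduction in uncertainty
% about X obtained by observing Y
I(X;Y) = \sum_{x}\sum_{y} p(x,y) \log_2 \frac{p(x,y)}{p(x)\, p(y)}
       = H(X) - H(X \mid Y)
```

Information gain as computed in the decision-tree sketch above is exactly the mutual information between the split variable and the class label, estimated from sample counts.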