
Meaning of MARKOV CHAIN in English
  1. A Markov process for which the parameter takes discrete time values.
  2. (statistics) A random process (Markov process) in which the probability of each discrete state in a series depends only on the properties of the immediately preceding state (or the next preceding state), independent of the path by which that state was reached. It differs from the more general Markov process in that the states of a Markov chain are discrete rather than continuous. Certain physical processes, such as the diffusion of a molecule in a fluid, are modelled as Markov chains; see also random walk.
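The definition above can be sketched as a short simulation: a minimal two-state discrete-time Markov chain in Python, where each step depends only on the current state. The state names ("rain", "sun") and transition probabilities are illustrative, not part of the definition.

```python
import random

# Transition probabilities for a two-state chain (illustrative values).
# Each row sums to 1; the next state depends only on the current state.
transitions = {
    "rain": {"rain": 0.7, "sun": 0.3},
    "sun":  {"rain": 0.2, "sun": 0.8},
}

def step(state, rng):
    """Sample the next state using only the current state's transition row."""
    r = rng.random()
    cumulative = 0.0
    for nxt, p in transitions[state].items():
        cumulative += p
        if r < cumulative:
            return nxt
    return nxt  # guard against floating-point rounding

def walk(start, n, seed=0):
    """Simulate n steps of the chain: a random walk over discrete states."""
    rng = random.Random(seed)
    state = start
    path = [state]
    for _ in range(n):
        state = step(state, rng)
        path.append(state)
    return path

print(walk("sun", 5, seed=1))
```

Because the chain is "memoryless", `step` never looks at the path so far, only at the current state; this is the property that distinguishes a Markov chain from a general random process.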

Synonyms and antonyms of 'MARKOV CHAIN'

There are no examples of usage in our dictionary.