Collins Dictionary
Markov chain /ˈmɑːkɒf/
1. N a sequence of events, the probability for each of which is dependent only on the event immediately preceding it 馬爾可夫鏈 [statistics]
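
The defining property in this entry can also be stated formally. A minimal sketch in standard probability notation (the random variables X_1, X_2, … and states x_i are notation added here, not part of the dictionary entry): for every step n,

P(X_{n+1} = x \mid X_n = x_n, X_{n-1} = x_{n-1}, \ldots, X_1 = x_1) = P(X_{n+1} = x \mid X_n = x_n)

That is, once the current event X_n is known, the earlier history contributes nothing further to the probability of the next event.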