Normal numbers and computer science

Émile Borel defined normality more than 100 years ago to formalize the most basic form of randomness for real numbers. A number is normal to a given integer base if its expansion in that base is such that all blocks of digits of the same length occur in it with the same limiting frequency. This chap...
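A minimal LaTeX sketch of the condition described in the abstract, added here for reference; the symbols b (base), k (block length), w (block of digits), and a_i (digits of the expansion) are notation chosen for this sketch, not taken from the chapter itself. Since all b^k blocks of length k are required to have the same limiting frequency, that common frequency is necessarily b^{-k}:

% A real number x is normal to an integer base b >= 2 if, writing its
% base-b expansion as x = \lfloor x \rfloor . a_1 a_2 a_3 \ldots,
% every block w of k digits occurs with the same limiting frequency:
\[
  \lim_{n \to \infty}
  \frac{\#\{\, i \le n : a_i a_{i+1} \cdots a_{i+k-1} = w \,\}}{n}
  = b^{-k}
  \qquad \text{for every } k \ge 1 \text{ and every } w \in \{0,1,\dots,b-1\}^k .
\]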


Bibliographic Details
Published: 2018
Online access: https://bibliotecadigital.exactas.uba.ar/collection/paper/document/paper_22970215_v_n9783319691510_p233_Becher
http://hdl.handle.net/20.500.12110/paper_22970215_v_n9783319691510_p233_Becher
Contributed by:

Similar Items