# Downsampling
When an interactive table is displayed by `itables`, the table data is embedded in the notebook output. As we don't want your notebook to become unreasonably heavy just because you displayed a large table, `itables` has a downsampling mechanism in place.

When the data in a table is larger than `maxBytes` (64KB by default), `itables` will display only a subset of the table, one that fits into `maxBytes`, and show a warning that points to the `itables` documentation.
If you wish, you can increase the value of `maxBytes`, or even deactivate the limit entirely (with `maxBytes=0`). Similarly, you can set a limit on the number of rows (`maxRows`, which defaults to 0, i.e. no limit) or columns (`maxColumns`, which defaults to 200).
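The idea behind downsampling can be sketched as: shrink the table until its serialized size fits under the byte budget. Below is a minimal, purely illustrative version that halves the row count until the DataFrame fits; it is not the actual `itables` implementation (which also trims columns and keeps rows from both the head and the tail of the table):

```python
import numpy as np
import pandas as pd


def downsample_sketch(df: pd.DataFrame, max_bytes: int) -> pd.DataFrame:
    """Naive sketch: keep halving the row count until the
    DataFrame's in-memory size fits under max_bytes."""
    while df.memory_usage(deep=True).sum() > max_bytes and len(df) > 1:
        df = df.iloc[: len(df) // 2]
    return df


big = pd.DataFrame(np.ones((10000, 4)))
small = downsample_sketch(big, 8192)
assert small.memory_usage(deep=True).sum() <= 8192
```

The real mechanism is more careful about which rows and columns it keeps, but the principle is the same: the displayed subset is chosen to respect the `maxBytes` budget.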
```python
import itables.options as opt

from itables import init_notebook_mode, show
from itables.downsample import as_nbytes, nbytes
from itables.sample_dfs import get_indicators

init_notebook_mode(all_interactive=True)

opt.lengthMenu = [2, 5, 10, 20, 50, 100, 200, 500]
opt.maxBytes = "8KB"
```
```python
df = get_indicators()
as_nbytes(opt.maxBytes), nbytes(df)
```

```
(8192, 28000)
```
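Note how `as_nbytes` turns the human-readable `"8KB"` into a byte count (8192), which is then compared with the table's size (28000 bytes here), so this table will be downsampled. A hypothetical parser for such size strings could look like this (this is only a sketch, not the actual `as_nbytes` implementation):

```python
def parse_size(size) -> int:
    """Sketch of a size-string parser: "8KB" -> 8192, "1MB" -> 1048576.
    Plain numbers are taken as a byte count."""
    units = {"B": 1, "KB": 2**10, "MB": 2**20, "GB": 2**30}
    if isinstance(size, (int, float)):
        return int(size)
    text = size.strip().upper()
    # Try the longest suffixes first so "KB" is not mistaken for "B"
    for suffix in sorted(units, key=len, reverse=True):
        if text.endswith(suffix):
            return int(float(text[: -len(suffix)]) * units[suffix])
    return int(text)


assert parse_size("8KB") == 8192
assert parse_size("1MB") == 1048576
```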
```python
df
```

*(interactive table output, downsampled; columns: id, name, unit, source, sourceNote, sourceOrganization, topics)*
To show the table in full, we can modify the value of `maxBytes`, either locally:
```python
show(df, maxBytes=0)
```

*(interactive table output, shown in full; columns: id, name, unit, source, sourceNote, sourceOrganization, topics)*
or globally:
```python
opt.maxBytes = "1MB"
df
```

*(interactive table output, shown in full; columns: id, name, unit, source, sourceNote, sourceOrganization, topics)*
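The `maxRows` and `maxColumns` limits work analogously to `maxBytes`: rows and columns beyond the cap are dropped from the embedded data. A pandas-only sketch of such capping (illustrative only; the names `cap` and its keyword arguments are ours, not the library's):

```python
import numpy as np
import pandas as pd


def cap(df: pd.DataFrame, max_rows: int = 0, max_columns: int = 200) -> pd.DataFrame:
    """Cap a DataFrame by column and row count.
    A value of 0 means "no limit", as with itables' maxRows/maxBytes."""
    if max_columns and df.shape[1] > max_columns:
        df = df.iloc[:, :max_columns]
    if max_rows and len(df) > max_rows:
        # Keep rows from both ends, in the spirit of downsampling
        half = max_rows // 2
        df = pd.concat([df.iloc[:half], df.iloc[-(max_rows - half):]])
    return df


wide = pd.DataFrame(np.zeros((500, 300)))
capped = cap(wide, max_rows=100, max_columns=200)
assert capped.shape == (100, 200)
```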