
on disk reading of a large h5ad file #254

Open
hoondy opened this issue Nov 19, 2022 · 0 comments
hoondy (Contributor) commented Nov 19, 2022

I am working with a large h5ad file, and loading it with pg.read_input requires a very large amount of memory. AnnData lets users read an h5ad file partially into memory (.obs or .var) while leaving the count matrix (.X) on disk.

See https://anndata-tutorials.readthedocs.io/en/latest/getting-started.html#Partial-reading-of-large-data
See also https://anndata.readthedocs.io/en/latest/fileformat-prose.html

I was wondering whether something similar could be done with pegasus. Single-cell datasets keep getting larger, and this would be a useful option for users working with limited memory.
