
Cannot sink_parquet when using .is_in() inside of pl.when()/then() in polars > 0.20.19 #15767

Closed
2 tasks done
dhruvyy opened this issue Apr 19, 2024 · 2 comments · Fixed by #20052
Assignees
Labels
accepted Ready for implementation bug Something isn't working P-medium Priority: medium python Related to Python Polars regression Issue introduced by a new release

Comments

@dhruvyy

dhruvyy commented Apr 19, 2024

Checks

  • I have checked that this issue has not already been reported.
  • I have confirmed this bug exists on the latest version of Polars.

Reproducible example

import polars as pl


def main() -> None:
    df = pl.LazyFrame({"a": [1, 2, 3], "b": ["a", "b", "c"]})

    df = df.with_columns(
        pl.when((pl.col("a") == 1) & (pl.col("b").is_in(["a", "b"])))
        .then(pl.lit("1"))
        .otherwise(pl.lit("0"))
        .alias("1_check")
    )

    df.sink_parquet("polars_bug.parquet")


if __name__ == "__main__":
    main()

Log output

Traceback (most recent call last):
  File "/home/user/polars_bug/polars_bug.py", line 19, in <module>
    main()
  File "/home/user/polars_bug/polars_bug.py", line 14, in main
    df.sink_parquet("polars_bug.parquet")
  File "/home/user/.cache/pypoetry/virtualenvs/polars-bug-5_dyLeoN-py3.10/lib/python3.10/site-packages/polars/_utils/unstable.py", line 59, in wrapper
    return function(*args, **kwargs)
  File "/home/user/.cache/pypoetry/virtualenvs/polars-bug-5_dyLeoN-py3.10/lib/python3.10/site-packages/polars/lazyframe/frame.py", line 1962, in sink_parquet
    return lf.sink_parquet(
polars.exceptions.InvalidOperationError: sink_Parquet(ParquetWriteOptions { compression: Zstd(None), statistics: true, row_group_size: None, data_pagesize_limit: None, maintain_order: true }) not yet supported in standard engine. Use 'collect().write_parquet()'

Issue description

Details

  • In polars 0.20.19, running a query with .is_in and then df.sink_parquet("path") worked fine.
  • In polars > 0.20.19, the same code raises the polars.exceptions.InvalidOperationError shown in the log output above.

Observations

  • I have tried all the .sink_ methods and none of them work.

Expected behavior

The file should be written in streaming mode, as it was with polars 0.20.19.

Installed versions

--------Version info---------
Polars:               0.20.22-rc.1
Index type:           UInt32
Platform:             Linux-5.15.146.1-microsoft-standard-WSL2-x86_64-with-glibc2.35
Python:               3.10.12 (main, Nov 20 2023, 15:14:05) [GCC 11.4.0]

----Optional dependencies----
adbc_driver_manager:  <not installed>
cloudpickle:          <not installed>
connectorx:           <not installed>
deltalake:            <not installed>
fastexcel:            <not installed>
fsspec:               <not installed>
gevent:               <not installed>
hvplot:               <not installed>
matplotlib:           <not installed>
nest_asyncio:         <not installed>
numpy:                <not installed>
openpyxl:             <not installed>
pandas:               <not installed>
pyarrow:              <not installed>
pydantic:             <not installed>
pyiceberg:            <not installed>
pyxlsb:               <not installed>
sqlalchemy:           <not installed>
xlsx2csv:             <not installed>
xlsxwriter:           <not installed>
@dhruvyy dhruvyy added bug Something isn't working needs triage Awaiting prioritization by a maintainer python Related to Python Polars labels Apr 19, 2024
@duskmoon314
Contributor

This is not a Python-only issue. I can reproduce this bug in Rust.

use polars::prelude::*;

fn main() -> anyhow::Result<()> {
    let df = df!(
        "a" => &[1, 2, 3],
        "b" => &["a", "b", "c"]
    )?;

    let df = df.lazy().with_column(
        when(
            col("a")
                .eq(1)
                .and(col("b").is_in(lit(Series::from_iter(["a", "b"])))),
        )
        .then(lit("1"))
        .otherwise(lit("0"))
        .alias("1_check"),
    );

    df.sink_parquet("./tmp.parquet", ParquetWriteOptions::default())?;

    Ok(())
}

The output is:

Error: invalid operation: sink_Parquet(ParquetWriteOptions { compression: Zstd(None), statistics: StatisticsOptions { min_value: true, max_value: true, distinct_count: false, null_count: true }, row_group_size: None, data_pagesize_limit: None, maintain_order: false }) not yet supported in standard engine. Use 'collect().write_parquet()'

Version:

rustc 1.81.0-nightly (c1b336cb6 2024-06-21)

polars = { version = "0.41.2", features = [
    "dtype-u8",
    "dtype-u16",
    "is_in",
    "lazy",
    "parquet",
    "streaming",
] }

@lmocsi

lmocsi commented Nov 20, 2024

This bug is still present in polars==1.14.0, with both sink_ipc() and sink_parquet(). :(

@nameexhaustion nameexhaustion changed the title .is_in method not compliant with sink_parquet in polars > 0.20.19 Cannot sink_parquet when using .is_in() inside of pl.when()/then() in polars > 0.20.19 Nov 21, 2024
@nameexhaustion nameexhaustion added accepted Ready for implementation P-medium Priority: medium and removed needs triage Awaiting prioritization by a maintainer labels Nov 21, 2024
@nameexhaustion nameexhaustion self-assigned this Nov 21, 2024
@github-project-automation github-project-automation bot moved this to Ready in Backlog Nov 21, 2024
@nameexhaustion nameexhaustion added the regression Issue introduced by a new release label Nov 21, 2024
@github-project-automation github-project-automation bot moved this from Ready to Done in Backlog Nov 29, 2024