write_catalog

Catalog.write_catalog(base_catalog_path: str | Path | UPath, *, catalog_name: str | None = None, default_columns: list[str] | None = None, as_collection: bool = True, overwrite: bool = False, create_thumbnail: bool = True, error_if_empty: bool = True, **kwargs)

Save the catalog to disk in HATS format.

Parameters:
base_catalog_path : str | Path | UPath

Location to which the catalog is saved.

catalog_name : str

The name of the catalog to be saved.

default_columns : list[str]

A metadata property with the list of columns in the catalog to be loaded by default. By default, uses the default columns from the original HATS catalog, if they exist.

as_collection : bool, default True

If True, saves the catalog and its margin as a collection.

overwrite : bool, default False

If True, an existing catalog is overwritten.

create_thumbnail : bool, default True

If True, creates a data thumbnail for the catalog.

error_if_empty : bool, default True

If True, raises an error if the catalog is empty.

**kwargs

Arguments to pass to the parquet write operations.

Examples

Write a small synthetic catalog to disk:

>>> import lsdb
>>> from lsdb.nested.datasets import generate_data
>>> nf = generate_data(1000, 5, seed=0, ra_range=(0.0, 300.0), dec_range=(-50.0, 50.0))
>>> catalog = lsdb.from_dataframe(nf.compute()[["ra", "dec", "id"]], catalog_name="demo")
>>> catalog.write_catalog(<your path here> / "demo_catalog", overwrite=True)