- Does it handle nested/repeated Parquet columns?
- Yes. Nested and repeated fields (Dremel-style repetition/definition levels) are decoded and rendered as nested JSON objects and arrays, matching how Spark and BigQuery surface nested Parquet columns.
- Is there a file size limit?
- Files up to 100 MB can be processed in the browser. Larger files are better handled with a command-line tool such as parquet-tools or pyarrow, since decompressed Parquet data can take up substantially more browser memory than its on-disk size.
- Does it support all Parquet logical types?
- Standard logical types are supported: STRING, INT (8/16/32/64), FLOAT, DOUBLE, BOOLEAN, DATE, TIME, TIMESTAMP, DECIMAL, UUID, and JSON. Custom logical types may be shown as raw bytes.