I'm trying to run autofaiss build_index as follows:
[nix-shell:~]$ autofaiss build_index ./deleteme/ --file_format=parquet
2022-12-15 23:23:02,143 [INFO]: Using 32 omp threads (processes), consider increasing --nb_cores if you have more
2022-12-15 23:23:02,144 [INFO]: Launching the whole pipeline 12/15/2022, 23:23:02
2022-12-15 23:23:02,144 [INFO]: Reading total number of vectors and dimension 12/15/2022, 23:23:02
2022-12-15 23:23:02,146 [INFO]: >>> Finished "Reading total number of vectors and dimension" in 0.0028 secs
2022-12-15 23:23:02,147 [INFO]: >>> Finished "Launching the whole pipeline" in 0.0030 secs
Traceback (most recent call last):
File "/nix/store/0cnbvzcbn02najv78fsqvvjivgy4dpkk-python3.10-autofaiss-2.15.3/bin/.autofaiss-wrapped", line 9, in <module>
sys.exit(main())
File "/nix/store/0cnbvzcbn02najv78fsqvvjivgy4dpkk-python3.10-autofaiss-2.15.3/lib/python3.10/site-packages/autofaiss/external/quantize.py", line 596, in main
fire.Fire(
File "/nix/store/801g89pidv78hqddvp29r08h1ji62bqk-python3.10-fire-0.4.0/lib/python3.10/site-packages/fire/core.py", line 141, in Fire
component_trace = _Fire(component, args, parsed_flag_args, context, name)
File "/nix/store/801g89pidv78hqddvp29r08h1ji62bqk-python3.10-fire-0.4.0/lib/python3.10/site-packages/fire/core.py", line 466, in _Fire
component, remaining_args = _CallAndUpdateTrace(
File "/nix/store/801g89pidv78hqddvp29r08h1ji62bqk-python3.10-fire-0.4.0/lib/python3.10/site-packages/fire/core.py", line 681, in _CallAndUpdateTrace
component = fn(*varargs, **kwargs)
File "/nix/store/0cnbvzcbn02najv78fsqvvjivgy4dpkk-python3.10-autofaiss-2.15.3/lib/python3.10/site-packages/autofaiss/external/quantize.py", line 205, in build_index
embedding_reader = EmbeddingReader(
File "/nix/store/5gjs659k7bjza921kajn3vikgghkz5dk-python3.10-embedding-reader-1.5.0/lib/python3.10/site-packages/embedding_reader/embedding_reader.py", line 22, in __init__
self.reader = ParquetReader(
File "/nix/store/5gjs659k7bjza921kajn3vikgghkz5dk-python3.10-embedding-reader-1.5.0/lib/python3.10/site-packages/embedding_reader/parquet_reader.py", line 68, in __init__
raise ValueError(f"No embeddings found in folder {embeddings_folder}")
ValueError: No embeddings found in folder ./deleteme/
But I do have embeddings in ./deleteme/!
[nix-shell:~]$ ls ./deleteme/
deleteme.parquet-00000-of-00001
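One thing I'm not sure about is how embedding_reader discovers files inside that folder. If it only picks up names that end in ".parquet" (an assumption on my part; I haven't read its file-listing code), then my deleteme.parquet-00000-of-00001 shard naming would be silently skipped. A quick glob illustrates the difference:

import glob

# Assumption: the reader may only match files whose names end in ".parquet".
# With my current shard naming, such a suffix filter finds nothing.
print(glob.glob("./deleteme/*.parquet"))   # -> []
print(glob.glob("./deleteme/*.parquet*"))  # -> ['./deleteme/deleteme.parquet-00000-of-00001']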
Furthermore, this parquet file parses just fine with pyarrow and has an embedding column, which is the column name autofaiss expects by default:
In [14]: pq.read_table("./deleteme/deleteme.parquet-00000-of-00001")
Out[14]:
pyarrow.Table
vin: string
timestamp: int64
camera: string
bbox: fixed_size_list<item: uint16>[4]
  child 0, item: uint16
id: int64
embedding: fixed_size_list<item: float>[768]
  child 0, item: float
----
vin: [["XX4L4100140","XX4L4100140","XX4L4100140","XX4L4100140","XX9L4100103",...,"XXXL4100076","XXXL4100076","XXXL4100076","XXXL4100076","XXXL4100076"]]
timestamp: [[1641009004,1641009004,1641009004,1641009004,1640995845,...,1641002256,1641002256,1641002256,1641002256,1641002256]]
camera: [["camera_back_left","camera_back_left","camera_back_left","camera_back_left","camera_rear_medium",...,"camera_front_left_80","camera_front_left_80","camera_front_left_80","camera_front_left_80","camera_front_left_80"]]
bbox: [[[1476,405,1824,839],[269,444,632,637],...,[826,377,981,492],[1194,404,1480,587]]]
id: [[-8209940914704430861,-8874558295300428965,6706661532224839957,-8984308169583777616,1311470225947591668,...,-8769893754771418171,-8253568985418968059,-6239971725986942111,7715533091743341224,2502116624477591343]]
embedding: [[[-0.015306762,0.054586615,0.022397395,0.008673363,-0.0064821607,...,-0.023860542,0.032048535,-0.029431753,0.012359367,-0.022298913],[-0.006019405,0.04093461,0.010485844,0.00063089275,0.023878522,...,0.018967431,0.006789252,-0.01607387,-0.0037895043,0.009490352],...,[0.009580072,0.06454213,-0.0065298285,0.017814448,0.026221843,...,0.032834977,0.0094326865,-0.007913973,-0.009541624,-0.0115858],[0.009568084,0.057270113,-0.0055452115,0.008511255,0.019073263,...,0.0302203,0.009586956,0.0019548207,0.00042776446,0.0094863055]]]
What's going wrong here? How can I create an index out of a parquet dataset?
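For reference, this is the in-process call I'd expect to be equivalent to the CLI invocation above (keyword names taken from the autofaiss docs; the index output paths are just placeholders):

from autofaiss import build_index

# Should mirror the CLI run above. "embedding" is autofaiss's default
# embedding column name, and the output paths are placeholders.
build_index(
    embeddings="./deleteme/",
    index_path="./deleteme_index/knn.index",
    index_infos_path="./deleteme_index/index_infos.json",
    file_format="parquet",
    embedding_column_name="embedding",
)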