61 Commits

SHA1 Message Date
68e91a62fd Clean up the ZiQL parser a bit and add a condition check
Added a check at the end of condition parsing to verify that the condition is
valid; for example, < between strings is not.

Also removed the send_all state and moved the check inside filter_and_send
instead, so update and delete can do the same.

Also fixed some small bugs/errors.
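
A minimal sketch of what such a condition validity check could look like in Zig
(the enum and function names here are hypothetical, not the actual parser types):

    // Hypothetical sketch: reject comparison operators that make no sense
    // for the data type of the condition, e.g. < between strings.
    const DataType = enum { int, float, str, bool, date };
    const Operator = enum { equal, different, superior, inferior, superior_or_equal, inferior_or_equal };

    fn conditionIsValid(data_type: DataType, operator: Operator) bool {
        return switch (operator) {
            .equal, .different => true, // = and != are fine for every type
            else => switch (data_type) { // <, >, <=, >= only for ordered types
                .int, .float, .date => true,
                .str, .bool => false,
            },
        };
    }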
2024-10-13 13:30:30 +02:00
e45d8579a9 Added the db metrics, use and init commands
The DataEngine is now deinitialized and initialized again every time the user
swaps databases using db use. If no database is selected, an error is displayed.
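
A minimal sketch of that swap, with a stand-in DataEngine (the real init/deinit
signatures may differ):

    // Stand-in engine: only remembers which database it points at.
    const DataEngine = struct {
        path: []const u8,
        fn init(path: []const u8) DataEngine {
            return .{ .path = path };
        }
        fn deinit(self: *DataEngine) void {
            self.* = undefined;
        }
    };

    // "db use <name>": deinit the current engine, if any, then init a new
    // one for the newly selected database.
    fn useDatabase(current: *?DataEngine, path: []const u8) void {
        if (current.*) |*engine| engine.deinit();
        current.* = DataEngine.init(path);
    }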
2024-10-13 11:26:44 +02:00
ac4186529d Implement send all and send JSON
The JSON is now sent using the additional data.

Also implemented the empty filter and the no-filter form, like GRAB User {}
and GRAB User.
2024-10-12 19:02:23 +02:00
e6357c4d74 Added DELETE
Basically uses the same function as update, but instead of rewriting an
updated row, I skip it.
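
A rough sketch of that idea in Zig (simplified; the real FileEngine code
differs): rewrite the data file line by line and simply do not write back the
rows whose UUID is in the delete list.

    const std = @import("std");

    // Sketch only: copy a tabular struct file, dropping the rows whose UUID
    // (assumed to be the first column) is in `to_delete`. UPDATE uses the
    // same kind of loop but writes a modified line instead of skipping it.
    fn rewriteWithoutRows(
        reader: anytype,
        writer: anytype,
        to_delete: *const std.StringHashMap(void),
    ) !void {
        var buf: [4096]u8 = undefined;
        while (try reader.readUntilDelimiterOrEof(&buf, '\n')) |line| {
            const uuid = std.mem.sliceTo(line, ' ');
            if (to_delete.contains(uuid)) continue; // DELETE: skip this row
            try writer.print("{s}\n", .{line}); // everything else is kept as is
        }
    }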
2024-10-12 16:20:34 +02:00
94e3905009 Implemented UPDATE and errors for the ZiQL Parser
Created a function used by UPDATE that takes a list of UUIDs and a map of new
values. It can be optimized later, but it works for now.

Also started to create more proper error handling with custom errors,
starting with ZiQLError.
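
Two small sketches of what this commit describes (names and members are
hypothetical, not the actual ZipponDB code): the data the UPDATE helper takes,
and a dedicated error set.

    const std = @import("std");

    // Hypothetical error set for the ZiQL parser; the real members may differ.
    const ZiQLError = error{
        SyntaxError,
        InvalidCondition,
        UnknownStruct,
        UnknownMember,
    };

    // Hypothetical shape of the UPDATE helper's input: the UUIDs of the rows
    // that matched the filter plus a map of member name -> new value.
    const UpdateRequest = struct {
        uuids: []const []const u8,
        new_values: std.StringHashMap([]const u8),
    };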
2024-10-12 15:26:24 +02:00
fbcca0dc09 Implemented dynamic schema
Started by writing a SchemaEngine, but in the end I just put everything
inside the FileEngine.

Now you can use 'schema init path/to/schema' to initialize the struct
folders and the first data file. A copy of the schema is also saved in a file
in the ZipponDB folder.
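
A rough sketch in Zig of what that initialization could do on disk (folder
layout and file names here are assumptions, not necessarily ZipponDB's actual
ones):

    const std = @import("std");

    // Hypothetical sketch: create one folder and one empty first data file
    // per struct in the schema, then keep a copy of the schema file inside
    // the ZipponDB folder.
    fn initFromSchema(db_dir: std.fs.Dir, schema_path: []const u8, struct_names: []const []const u8) !void {
        var name_buf: [128]u8 = undefined;
        for (struct_names) |name| {
            var dir = try db_dir.makeOpenPath(name, .{});
            defer dir.close();
            const data_file_name = try std.fmt.bufPrint(&name_buf, "{s}_0.zippondata", .{name});
            const file = try dir.createFile(data_file_name, .{});
            file.close();
        }
        try std.fs.cwd().copyFile(schema_path, db_dir, "schema.copy", .{});
    }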
2024-10-11 17:51:45 +02:00
e2e8bc4d80 Created the engines folder
Created a new folder to clean up the repo a bit and put the file and schema
engines inside. As those and the Parser depend on types.zig, I also put it
inside the new engines folder.
2024-10-09 23:32:37 +02:00
b008f434a6 Switched to one tabular file for each struct
Created a new Parser, specific to the FileEngine, to read each line. It is
slower because I need to parse character by character, since there is no fixed
length for the data in files. Before, I was just reading until the end of the
file.

I'm going to need to find some tricks to improve the parsing of data. I am
thinking of using the stream directly instead of streamUntilDelimiter.
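
For context, a minimal sketch of the streamUntilDelimiter approach mentioned
here (buffer handling and allocator choice are assumptions):

    const std = @import("std");

    // Sketch: read the data file line by line with streamUntilDelimiter and
    // parse each line character by character, since values have no fixed
    // length. The idea above is to read from the buffered stream directly
    // instead of materializing every line first.
    fn readLines(file: std.fs.File) !void {
        var buffered = std.io.bufferedReader(file.reader());
        const reader = buffered.reader();

        var line = std.ArrayList(u8).init(std.heap.page_allocator);
        defer line.deinit();

        while (true) {
            line.clearRetainingCapacity();
            reader.streamUntilDelimiter(line.writer(), '\n', null) catch |err| switch (err) {
                error.EndOfStream => break,
                else => return err,
            };
            // ... parse line.items character by character here ...
        }
    }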
2024-10-09 23:20:28 +02:00
cda2ee16a8 Basic GRAB query passes! 2024-10-09 20:12:07 +02:00
67fb49ded5 Working new ZiQL parser for basic ADD and GRAB!!! 2024-10-08 00:18:25 +02:00
44e48a5276 Big rework - now uses a global ZiQL parser - still buggy, need to debug the tests 2024-10-07 00:40:24 +02:00