Ok, I came up with a generic CSV loader for Redis datatypes. It is plugged into the [puredis] object with the csv method, which receives a filepath and a datatype, making it possible to build Strings, Lists, Hashes, Sets and Sorted Sets from CSV files.
I updated the pd-help files to match, as well as the docs (the Readme): https://github.com/lp/puredis#readme
2011/7/26 Hans-Christoph Steiner hans@at.or.at
Where I could see it used is with large data sets, like I did in this project: http://at.or.at/hans/terrenatale/
The data is used to generate the large-scale events like the dropping of coins for each country, the layout of static representing population, etc. I suppose that I would need to convert the data to a specific format for Redis anyway, so just supporting CSV would be fine. I ended up using scripts and Pd to process a lot of the data. What would make puredis more interesting is if it made it easy to process large sets of data into something useful for generating sound and video.
.hc
On Jul 26, 2011, at 2:54 PM, Louis-Philippe wrote:
before jumping on the puredis-loader implementation, here is how I see the specs, please tell me if I get what you think:

___________________________________
Strings:
string.csv:
KEY, VALUE
key1, value1
key2, value2
[lpuredis string string.csv]
-> creates a String for each line
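To make the mapping concrete, here is a rough sketch in Python with redis-py (not puredis code, just an illustration of the intended Redis commands, assuming the first row is a header):

import csv
import redis

r = redis.Redis()
with open("string.csv", newline="") as f:
    rows = csv.reader(f, skipinitialspace=True)
    next(rows)                    # skip the KEY, VALUE header row
    for key, value in rows:
        r.set(key, value)         # SET key1 value1 / SET key2 value2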
___________________________________
Lists and Sets:

list.csv:
KEY, VALUE1, VALUE2, ...
key1, item1, item2, item3
key2, item1, item2, item3
[lpuredis list list.csv]
-> creates a List for each line
[lpuredis set list.csv]
-> creates a Set for each line
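Same idea sketched in Python/redis-py (again just an illustration, not the actual puredis internals): the one file can feed either RPUSH or SADD depending on the requested datatype.

import csv
import redis

r = redis.Redis()
with open("list.csv", newline="") as f:
    rows = csv.reader(f, skipinitialspace=True)
    next(rows)                    # skip the header row
    for key, *items in rows:
        r.rpush(key, *items)      # as a List: RPUSH key1 item1 item2 item3
        # r.sadd(key, *items)     # as a Set:  SADD key1 item1 item2 item3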
___________________________________
Sorted Sets:

zset.csv:
KEY, VALUE1, SCORE1, VALUE2, SCORE2, ...
key1, item1, 1, item2, 2, item3, 3
key2, item1, 4, item2, 5, item3, 6
[lpuredis zset zset.csv]
-> creates a Sorted Set for each line
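Sketched the same way (illustration only, assuming a recent redis-py where ZADD takes a member-to-score mapping):

import csv
import redis

r = redis.Redis()
with open("zset.csv", newline="") as f:
    rows = csv.reader(f, skipinitialspace=True)
    next(rows)                    # skip the header row
    for key, *pairs in rows:
        # columns alternate value, score: item1, 1, item2, 2, ...
        scores = {pairs[i]: float(pairs[i + 1]) for i in range(0, len(pairs), 2)}
        r.zadd(key, scores)       # ZADD key1 1 item1 2 item2 3 item3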
___________________________________
Hashes:

hash.csv:
KEY, KEY1, VALUE1, KEY2, VALUE2, KEY3, VALUE3
key1, hkey1, value1, hkey2, value2, hkey3, value3
key2, hkey1, value1, hkey2, value2, hkey3, value3
or even, taking the hash field names from the header row:

KEY, FIELD1, FIELD2, FIELD3
key1, item1, item2, item3
key2, item1, item2, item3
[lpuredis hash hash.csv]
-> creates a Hash for each line
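And for the first hash layout above (illustration only, assuming redis-py 3.5+ where HSET accepts a mapping):

import csv
import redis

r = redis.Redis()
with open("hash.csv", newline="") as f:
    rows = csv.reader(f, skipinitialspace=True)
    next(rows)                      # skip the header row
    for key, *pairs in rows:
        # columns alternate field, value: hkey1, value1, hkey2, value2, ...
        fields = dict(zip(pairs[0::2], pairs[1::2]))
        r.hset(key, mapping=fields) # HSET key1 hkey1 value1 hkey2 value2 ...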
___________________________________

For SQL files I am clueless, but I guess it would be similar... JSON would be the easiest to translate, but would it be of any use to Pd users?
L-P
------------------------------------------------------------
If you are not part of the solution, you are part of the problem.