#1957: v.in.ascii (points.c) does not import some numbers in attached example
------------------------+---------------------------------------------------
Reporter: ychemin | Owner: grass-dev@…
Type: defect | Status: new
Priority: normal | Milestone: 6.5.0
Component: Vector | Version: svn-trunk
Keywords: v.in.ascii | Platform: Linux
Cpu: x86-64 |
------------------------+---------------------------------------------------
Changes (by hamish):
* milestone: 7.0.0 => 6.5.0
Comment:
(GRASS 6 filenames)
In vector/v.in.ascii/points.c, change buflen from 4000 to 24000 or some
other large enough value. Consider also increasing BUFFSIZE from 128 in
a2b.c and `char buf[1000];` in in.c; a rough sketch of these edits
follows.
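A minimal sketch of those three edits (the 24000 figure and the exact
placement in each file are suggestions, not committed code):

```c
/* vector/v.in.ascii/points.c -- enlarge the fixed input-line buffer;
 * the value must exceed the longest line in the input file */
buflen = 24000;            /* was: buflen = 4000; */

/* vector/v.in.ascii/a2b.c -- parsing buffer */
#define BUFFSIZE 24000     /* was: #define BUFFSIZE 128 */

/* vector/v.in.ascii/in.c -- local line buffer */
char buf[24000];           /* was: char buf[1000]; */
```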
For SQLite: "`The default setting for SQLITE_MAX_COLUMN is 2000. You can
change it at compile time to values as large as 32767.`"
http://www.sqlite.org/limits.html
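Since SQLITE_MAX_COLUMN is fixed when SQLite is compiled, raising it
means rebuilding the library. One way, sketched here under the assumption
that you build from the amalgamation, is a one-line wrapper (or the
equivalent `-DSQLITE_MAX_COLUMN=32767` compiler flag):

```c
/* shim.c -- build with: cc -c shim.c
 * equivalent to: cc -DSQLITE_MAX_COLUMN=32767 -c sqlite3.c */
#define SQLITE_MAX_COLUMN 32767   /* default is 2000; 32767 is the hard upper bound */
#include "sqlite3.c"
```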
How many met stations are there? If fewer than the number of time
records, maybe consider inverting the array and organizing the data by
time instead of by position? Or splitting it up into multiple files, e.g.
by year?
Comment(by hamish):
Probably best to write a script to create one vector map per row, with
columns 3..N rotated into a single column, and the timestamps in a second
column, matching the data by row number; a sketch follows.
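A hypothetical sketch of such a script in C (one output file per input
row; the `station_%d.csv` names, the two leading coordinate columns, and
the constant time step are all assumptions, not part of the ticket):

```c
#include <stdio.h>
#include <string.h>

#define MAXLINE 65536   /* must exceed the longest input line */

/* Rotate each stdin row (x, y, v1, ..., vN) into its own two-column
 * "step,value" table, one file per station/row.  Note strtok() collapses
 * empty fields, so this suits dense data only. */
int main(void)
{
    char line[MAXLINE];
    int row = 0;

    while (fgets(line, sizeof(line), stdin)) {
        char out_name[64];
        FILE *out;
        char *tok = strtok(line, ",\r\n");
        int col = 0, step = 0;

        snprintf(out_name, sizeof(out_name), "station_%d.csv", ++row);
        if (!(out = fopen(out_name, "w")))
            return 1;
        fprintf(out, "step,value\n");
        while (tok) {
            if (++col > 2)   /* columns 3..N hold the time series */
                fprintf(out, "%d,%s\n", ++step, tok);
            tok = strtok(NULL, ",\r\n");
        }
        fclose(out);
    }
    return 0;
}
```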
Comment(by ychemin):
Replying to [ticket:1957 ychemin]:
> v.in.ascii input=9.csv output=rain_$(echo 9.csv | sed 's/\.csv//g')
> separator=comma
>
> Real number of columns: 4020
> -----From v.in.ascii-----------
> Maximum number of columns: 1366
> Minimum number of columns: 1317
Merged from ticket #1958:
When importing a .csv made of several rows of 4020 columns each, the
import stops at row 18; looking into the file, that is near the 20000th
character. Same behavior in 6.4.2 (the Ubuntu stable version).
After setting, at points.c line 76:
buflen = 50000;
then error becomes:
Number of columns: 4020 <- THIS IS GOOD
DBMI-SQLite driver error:
Error in sqlite3_prepare():
too many columns on rain_9 <-
As mentioned, SQLite doesn't support more than 2000 columns out of the
box; for hints see the SQLite limits page linked above
(http://www.sqlite.org/limits.html).
#1957: v.in.ascii (points.c) does not import some numbers in attached example
----------------------+-----------------------------------------------------
Reporter: ychemin | Owner: grass-dev@…
Type: defect | Status: closed
Priority: normal | Milestone: 6.5.0
Component: Vector | Version: svn-trunk
Resolution: fixed | Keywords: v.in.ascii
Platform: Linux | Cpu: x86-64
----------------------+-----------------------------------------------------
Comment(by hamish):
It is likely that even 255 columns will not be supported by GRASS,
simply because the buffers are too small. Better to make the bottleneck
the DB backend, not the GRASS frontend.
As a ballpark estimate, say 2 columns (x, y) at 10 chars wide, plus 253
columns of varchar(255), with field separators and a DOS newline:
10 + 1 + 10 + 1 + 253*255 + 252 + 2 = 64791 bytes per line.
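One way to keep the frontend from ever being the bottleneck is to grow
the line buffer on demand instead of picking any fixed size; a minimal
sketch (not the committed fix, and independent of GRASS's own G_malloc
wrappers):

```c
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

/* Read one line of arbitrary length from fp; the caller frees the
 * result.  Returns NULL on EOF or allocation failure. */
static char *read_long_line(FILE *fp)
{
    size_t cap = 4096, len = 0;
    char *buf = malloc(cap);

    if (!buf)
        return NULL;
    while (fgets(buf + len, (int)(cap - len), fp)) {
        len += strlen(buf + len);
        if (len > 0 && buf[len - 1] == '\n')
            return buf;                 /* got the whole line */
        /* line longer than the buffer: double it and keep reading */
        char *tmp = realloc(buf, cap *= 2);
        if (!tmp) {
            free(buf);
            return NULL;
        }
        buf = tmp;
    }
    if (len == 0) {                     /* EOF with nothing read */
        free(buf);
        return NULL;
    }
    return buf;
}
```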
I think it's also worth thinking about the other solutions: transposing
the array and creating a script to make each data row its own map, with
the constant-step time series as a single column rather than a series of
individual columns. Work around the DB limitation by thinking about the
problem in a different way.