database - Why can't I write a dataframe to the DB? -


I have 32 GB of RAM and use Jupyter with pandas. The dataframe isn't that big, but when I try to write it to an Arctic database I get a MemoryError:

df_q.shape
(157293660, 10)

def memory(df):
    mem = df.memory_usage(index=True).sum() / (1024 ** 3)
    print(mem)

memory(df_q)
12.8912200034
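That figure is consistent with 10 float64 columns plus a materialized int64 index, i.e. 88 bytes per row (the dtypes are an assumption on my part, but the arithmetic matches the printed value):

```python
# 157293660 rows x (10 float64 columns + int64 index) = 88 bytes per row
rows = 157293660
bytes_per_row = 11 * 8
gib = rows * bytes_per_row / 1024 ** 3
print(round(gib, 4))  # -> 12.8912, matching memory(df_q)
```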

And then I try to write it:

from arctic import Arctic
import arctic

store = Arctic('.....')
lib = store['mylib']
lib.write('quotes', df_q)

MemoryError                               Traceback (most recent call last)
in ()
      1 memory(df_q)
----> 2 lib.write('quotes', df_q)

/usr/local/lib/python2.7/dist-packages/arctic/decorators.pyc in f_retry(*args, **kwargs)
     48     while True:
     49         try:
---> 50             return f(*args, **kwargs)
     51         except (DuplicateKeyError, ServerSelectionTimeoutError) as e:
     52             # Re-raise errors that won't go away.

/usr/local/lib/python2.7/dist-packages/arctic/store/version_store.pyc in write(self, symbol, data, metadata, prune_previous_version, **kwargs)
    561 
    562         handler = self._write_handler(version, symbol, data, **kwargs)
--> 563         mongo_retry(handler.write)(self._arctic_lib, version, symbol, data, previous_version, **kwargs)
    564 
    565         # Insert the new version into the version db

/usr/local/lib/python2.7/dist-packages/arctic/decorators.pyc in f_retry(*args, **kwargs)
     48     while True:
     49         try:
---> 50             return f(*args, **kwargs)
     51         except (DuplicateKeyError, ServerSelectionTimeoutError) as e:
     52             # Re-raise errors that won't go away.

/usr/local/lib/python2.7/dist-packages/arctic/store/_pandas_ndarray_store.pyc in write(self, arctic_lib, version, symbol, item, previous_version)
    301     def write(self, arctic_lib, version, symbol, item, previous_version):
    302         item, md = self.to_records(item)
--> 303         super(PandasDataFrameStore, self).write(arctic_lib, version, symbol, item, previous_version, dtype=md)
    304 
    305     def append(self, arctic_lib, version, symbol, item, previous_version):

/usr/local/lib/python2.7/dist-packages/arctic/store/_ndarray_store.pyc in write(self, arctic_lib, version, symbol, item, previous_version, dtype)
    385         version['type'] = self.TYPE
    386         version['up_to'] = len(item)
--> 387         version['sha'] = self.checksum(item)
    388 
    389         if previous_version:

/usr/local/lib/python2.7/dist-packages/arctic/store/_ndarray_store.pyc in checksum(self, item)
    370     def checksum(self, item):
    371         sha = hashlib.sha1()
--> 372         sha.update(item.tostring())
    373         return Binary(sha.digest())
    374 

MemoryError:

What is going on here? If I use df_q.to_csv() instead, it takes forever....
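The write path in the traceback above materializes at least two full copies of the data: one from to_records() (line 302) and one from tostring() (line 372), on top of the original frame. A scaled-down demonstration (tobytes() is the modern name for tostring()):

```python
import hashlib

import numpy as np
import pandas as pd

df = pd.DataFrame(np.zeros((1000, 10)))

rec = df.to_records(index=True)  # copy 1: a fresh NumPy record array
buf = rec.tobytes()              # copy 2: one contiguous bytes object
assert len(buf) == rec.nbytes    # the whole array, byte for byte

sha = hashlib.sha1(buf)          # the checksum Arctic computes
print(sha.hexdigest())
```

For a ~13 GB frame, each of those copies is another ~13 GB.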

Your error really is a memory error, but it is raised deep inside the Arctic library rather than by your own code. Walking through the traceback frames:

1st frame: Arctic's retry decorator (f_retry). Nothing failed here; it simply re-raises exceptions it cannot recover from, such as DuplicateKeyError and ServerSelectionTimeoutError. No server timeout actually occurred.

2nd frame: VersionStore.write looks up the handler for a DataFrame and delegates the write to it; the new version record is only inserted into MongoDB after the data is stored.

3rd frame: the pandas store converts the DataFrame to a NumPy record array with to_records(), which allocates a complete copy of the data.

Final frame: before storing, Arctic computes a SHA-1 checksum via item.tostring(), which serializes the entire record array into one more contiguous bytes object, and that ~13 GB allocation is what raises the MemoryError. So the problem lies in the Arctic write path (see the checksum frame), not in your data: df_q.to_csv() succeeds because it streams the frame out row by row, which is also why it is so slow. Upgrading the arctic package may help, but the simplest workaround is to write the frame in smaller pieces.
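Splitting the frame keeps each of Arctic's internal copies chunk-sized, using the library's append() after the initial write(). A minimal sketch (the chunk size and helper name are my own; it assumes a standard Arctic VersionStore library like `lib` above):

```python
import pandas as pd

def write_in_chunks(lib, symbol, df, chunk_rows=5000000):
    """Write a large DataFrame to an Arctic VersionStore in pieces,
    so Arctic's internal copies are chunk-sized, not frame-sized."""
    for start in range(0, len(df), chunk_rows):
        chunk = df.iloc[start:start + chunk_rows]
        if start == 0:
            lib.write(symbol, chunk)   # first piece creates the symbol
        else:
            lib.append(symbol, chunk)  # later pieces extend it

# Usage (hypothetical): write_in_chunks(lib, 'quotes', df_q)
```

Each append creates a new version in a VersionStore, which has its own overhead, so a chunk size in the low millions of rows is a reasonable starting point.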

