ouijaflow

Input saved to /scratch/irc/personal/robrechtc/tmp//Rtmp8YHvkj/input:
expression.csv
features_id.json
params.json
1/1000 [ 0%] ETA: 4554s | Loss: 272829.500
10/1000 [ 1%] ETA: 454s | Loss: 89267.789
[... per-10-iteration progress lines elided; every 100th iteration kept ...]
100/1000 [ 10%] ███ ETA: 44s | Loss: 31070.975
200/1000 [ 20%] ██████ ETA: 20s | Loss: 27573.467
300/1000 [ 30%] █████████ ETA: 12s | Loss: 25008.084
400/1000 [ 40%] ████████████ ETA: 8s | Loss: 41772.672
500/1000 [ 50%] ███████████████ ETA: 6s | Loss: 24107.895
600/1000 [ 60%] ██████████████████ ETA: 4s | Loss: 23446.922
700/1000 [ 70%] █████████████████████ ETA: 3s | Loss: 23131.637
800/1000 [ 80%] ████████████████████████ ETA: 1s | Loss: 23006.402
900/1000 [ 90%] ███████████████████████████ ETA: 0s | Loss: 22808.012
1000/1000 [100%] ██████████████████████████████ Elapsed: 8s | Loss: 22837.070
/usr/local/lib/python3.6/site-packages/edward/util/random_variables.py:52: FutureWarning: Conversion of the second argument of issubdtype from `float` to `np.floating` is deprecated. In future, it will be treated as `np.float64 == np.dtype(float).type`.
  not np.issubdtype(value.dtype, np.float) and \
output saved in /scratch/irc/personal/robrechtc/tmp//Rtmp8YHvkj/output:
pseudotime.csv
timings.json
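The FutureWarning above comes from edward's random_variables.py passing the builtin `float` (the deprecated `np.float` alias) as the second argument of `np.issubdtype`; newer NumPy expects the abstract scalar type `np.floating` there. A minimal sketch of the check and its non-deprecated form, in plain NumPy and independent of edward:

```python
import numpy as np

value = np.array([0.0, 1.5], dtype=np.float32)

# What the edward line effectively asks: "is this dtype a float dtype?"
# Deprecated form (triggers the FutureWarning on newer NumPy):
#   np.issubdtype(value.dtype, np.float)
# Non-deprecated form: test against the abstract scalar type np.floating,
# which matches float16/float32/float64 but not integer dtypes.
is_float = np.issubdtype(value.dtype, np.floating)
print(is_float)  # True
```

The warning is harmless for these runs; it only signals that the comparison semantics change in future NumPy releases.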
Input saved to /scratch/irc/personal/robrechtc/tmp//RtmpjArGRi/input:
expression.csv
features_id.json
params.json
1/1000 [ 0%] ETA: 4335s | Loss: 220247.234
10/1000 [ 1%] ETA: 433s | Loss: 80199.969
[... per-10-iteration progress lines elided; every 100th iteration kept ...]
100/1000 [ 10%] ███ ETA: 42s | Loss: 32443.926
200/1000 [ 20%] ██████ ETA: 20s | Loss: 29453.707
300/1000 [ 30%] █████████ ETA: 12s | Loss: 26529.889
400/1000 [ 40%] ████████████ ETA: 8s | Loss: 25863.418
500/1000 [ 50%] ███████████████ ETA: 6s | Loss: 25508.814
600/1000 [ 60%] ██████████████████ ETA: 4s | Loss: 25758.250
700/1000 [ 70%] █████████████████████ ETA: 3s | Loss: 25332.420
800/1000 [ 80%] ████████████████████████ ETA: 1s | Loss: 25135.967
900/1000 [ 90%] ███████████████████████████ ETA: 0s | Loss: 25088.186
1000/1000 [100%] ██████████████████████████████ Elapsed: 8s | Loss: 25035.576
/usr/local/lib/python3.6/site-packages/edward/util/random_variables.py:52: FutureWarning: Conversion of the second argument of issubdtype from `float` to `np.floating` is deprecated. In future, it will be treated as `np.float64 == np.dtype(float).type`.
  not np.issubdtype(value.dtype, np.float) and \
output saved in /scratch/irc/personal/robrechtc/tmp//RtmpjArGRi/output:
pseudotime.csv
timings.json
Input saved to /scratch/irc/personal/robrechtc/tmp//RtmpcFkqcc/input:
expression.csv
features_id.json
params.json
1/1000 [ 0%] ETA: 5667s | Loss: 3275300.500
10/1000 [ 1%] ETA: 604s | Loss: 2128586.750
2018-08-03 16:43:40.855330: W tensorflow/core/framework/op_kernel.cc:1202] OP_REQUIRES failed at pack_op.cc:88 : Resource exhausted: OOM when allocating tensor with shape[2,238,2429] and type float on /job:localhost/replica:0/task:0/device:CPU:0 by allocator cpu
/usr/local/lib/python3.6/site-packages/edward/util/random_variables.py:52: FutureWarning: Conversion of the second argument of issubdtype from `float` to `np.floating` is deprecated. In future, it will be treated as `np.float64 == np.dtype(float).type`.
not np.issubdtype(value.dtype, np.float) and \
Traceback (most recent call last):
  File "/usr/local/lib/python3.6/site-packages/tensorflow/python/client/session.py", line 1361, in _do_call
    return fn(*args)
  File "/usr/local/lib/python3.6/site-packages/tensorflow/python/client/session.py", line 1340, in _run_fn
    target_list, status, run_metadata)
  File "/usr/local/lib/python3.6/site-packages/tensorflow/python/framework/errors_impl.py", line 516, in __exit__
    c_api.TF_GetCode(self.status.status))
tensorflow.python.framework.errors_impl.ResourceExhaustedError: OOM when allocating tensor with shape[2,238,2429] and type float on /job:localhost/replica:0/task:0/device:CPU:0 by allocator cpu
  [[Node: inference/sample/DropoutNormal_1/log_prob/stack = Pack[N=2, T=DT_FLOAT, axis=0, _device="/job:localhost/replica:0/task:0/device:CPU:0"](inference/sample/DropoutNormal_1/log_prob/LogSigmoid_1, inference/sample/DropoutNormal_1/log_prob/add_2)]]
Hint: If you want to see a list of allocated tensors when OOM happens, add report_tensor_allocations_upon_oom to RunOptions for current allocation info.
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
  File "/code/run.py", line 26, in <module>
    oui.fit(expression.values, n_iter=int(p["iter"]))
  File "/usr/local/lib/python3.6/site-packages/ouijaflow/ouija.py", line 31, in fit
    inference.run(n_iter = n_iter, logdir = logdir)
  File "/usr/local/lib/python3.6/site-packages/edward/inferences/inference.py", line 146, in run
    info_dict = self.update()
  File "/usr/local/lib/python3.6/site-packages/edward/inferences/variational_inference.py", line 154, in update
    _, t, loss = sess.run([self.train, self.increment_t, self.loss], feed_dict)
  File "/usr/local/lib/python3.6/site-packages/tensorflow/python/client/session.py", line 905, in run
    run_metadata_ptr)
  File "/usr/local/lib/python3.6/site-packages/tensorflow/python/client/session.py", line 1137, in _run
    feed_dict_tensor, options, run_metadata)
  File "/usr/local/lib/python3.6/site-packages/tensorflow/python/client/session.py", line 1355, in _do_run
    options, run_metadata)
  File "/usr/local/lib/python3.6/site-packages/tensorflow/python/client/session.py", line 1374, in _do_call
    raise type(e)(node_def, op, message)
tensorflow.python.framework.errors_impl.ResourceExhaustedError: OOM when allocating tensor with shape[2,238,2429] and type float on /job:localhost/replica:0/task:0/device:CPU:0 by allocator cpu
  [[Node: inference/sample/DropoutNormal_1/log_prob/stack = Pack[N=2, T=DT_FLOAT, axis=0, _device="/job:localhost/replica:0/task:0/device:CPU:0"](inference/sample/DropoutNormal_1/log_prob/LogSigmoid_1, inference/sample/DropoutNormal_1/log_prob/add_2)]]
Hint: If you want to see a list of allocated tensors when OOM happens, add report_tensor_allocations_upon_oom to RunOptions for current allocation info.
Caused by op 'inference/sample/DropoutNormal_1/log_prob/stack', defined at:
  File "/code/run.py", line 26, in <module>
    oui.fit(expression.values, n_iter=int(p["iter"]))
  File "/usr/local/lib/python3.6/site-packages/ouijaflow/ouija.py", line 31, in fit
    inference.run(n_iter = n_iter, logdir = logdir)
  File "/usr/local/lib/python3.6/site-packages/edward/inferences/inference.py", line 125, in run
    self.initialize(*args, **kwargs)
  File "/usr/local/lib/python3.6/site-packages/edward/inferences/klqp.py", line 110, in initialize
    return super(KLqp, self).initialize(*args, **kwargs)
  File "/usr/local/lib/python3.6/site-packages/edward/inferences/variational_inference.py", line 68, in initialize
    self.loss, grads_and_vars = self.build_loss_and_gradients(var_list)
  File "/usr/local/lib/python3.6/site-packages/edward/inferences/klqp.py", line 149, in build_loss_and_gradients
    return build_reparam_loss_and_gradients(self, var_list)
  File "/usr/local/lib/python3.6/site-packages/edward/inferences/klqp.py", line 663, in build_reparam_loss_and_gradients
    inference.scale.get(x, 1.0) * x_copy.log_prob(dict_swap[x]))
  File "/usr/local/lib/python3.6/site-packages/tensorflow/python/ops/distributions/distribution.py", line 716, in log_prob
    return self._call_log_prob(value, name)
  File "/usr/local/lib/python3.6/site-packages/tensorflow/python/ops/distributions/distribution.py", line 698, in _call_log_prob
    return self._log_prob(value, **kwargs)
  File "/usr/local/lib/python3.6/site-packages/ouijaflow/dropout_normal.py", line 68, in _log_prob
    log_p_zero = tf.reduce_logsumexp(tf.stack([lp_drop, log_p_truezero]), axis = 0)
  File "/usr/local/lib/python3.6/site-packages/tensorflow/python/ops/array_ops.py", line 928, in stack
    return ops.convert_to_tensor(values, name=name)
  File "/usr/local/lib/python3.6/site-packages/tensorflow/python/framework/ops.py", line 946, in convert_to_tensor
    as_ref=False)
  File "/usr/local/lib/python3.6/site-packages/tensorflow/python/framework/ops.py", line 1036, in internal_convert_to_tensor
    ret = conversion_func(value, dtype=dtype, name=name, as_ref=as_ref)
  File "/usr/local/lib/python3.6/site-packages/tensorflow/python/ops/array_ops.py", line 1020, in _autopacking_conversion_function
    return _autopacking_helper(v, inferred_dtype, name or "packed")
  File "/usr/local/lib/python3.6/site-packages/tensorflow/python/ops/array_ops.py", line 983, in _autopacking_helper
    return gen_array_ops._pack(elems_as_tensors, name=scope)
  File "/usr/local/lib/python3.6/site-packages/tensorflow/python/ops/gen_array_ops.py", line 2839, in _pack
    "Pack", values=values, axis=axis, name=name)
  File "/usr/local/lib/python3.6/site-packages/tensorflow/python/framework/op_def_library.py", line 787, in _apply_op_helper
    op_def=op_def)
  File "/usr/local/lib/python3.6/site-packages/tensorflow/python/framework/ops.py", line 3271, in create_op
    op_def=op_def)
  File "/usr/local/lib/python3.6/site-packages/tensorflow/python/framework/ops.py", line 1650, in __init__
    self._traceback = self._graph._extract_stack()  # pylint: disable=protected-access
ResourceExhaustedError (see above for traceback): OOM when allocating tensor with shape[2,238,2429] and type float on /job:localhost/replica:0/task:0/device:CPU:0 by allocator cpu
  [[Node: inference/sample/DropoutNormal_1/log_prob/stack = Pack[N=2, T=DT_FLOAT, axis=0, _device="/job:localhost/replica:0/task:0/device:CPU:0"](inference/sample/DropoutNormal_1/log_prob/LogSigmoid_1, inference/sample/DropoutNormal_1/log_prob/add_2)]]
Hint: If you want to see a list of allocated tensors when OOM happens, add report_tensor_allocations_upon_oom to RunOptions for current allocation info.
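For scale, the tensor named in the OOM above is small in isolation, which suggests the process's memory was already exhausted overall (several datasets run in one session) rather than a single oversized allocation failing. A quick back-of-the-envelope check, using the shape from the error message and assuming the reported `float` is 4-byte float32:

```python
from functools import reduce

def tensor_mib(shape, itemsize=4):
    """Memory in MiB for a dense tensor of the given shape and element size."""
    n = reduce(lambda a, b: a * b, shape, 1)  # total element count
    return n * itemsize / 1024**2

# Shape reported by the ResourceExhaustedError: [2, 238, 2429]
print(f"{tensor_mib([2, 238, 2429]):.1f} MiB")  # ~4.4 MiB
```

So the failing `Pack` op needed only a few megabytes; following the log's own hint, adding `report_tensor_allocations_upon_oom` to the session's `RunOptions` would list what was already holding the memory.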
Input saved to /scratch/irc/personal/robrechtc/tmp//RtmpPDl2Cp/input:
expression.csv
features_id.json
params.json
2018-08-03 16:44:06.936994: W tensorflow/core/framework/op_kernel.cc:1202] OP_REQUIRES failed at pack_op.cc:88 : Resource exhausted: OOM when allocating tensor with shape[2,355,12789] and type float on /job:localhost/replica:0/task:0/device:CPU:0 by allocator cpu
/usr/local/lib/python3.6/site-packages/edward/util/random_variables.py:52: FutureWarning: Conversion of the second argument of issubdtype from `float` to `np.floating` is deprecated. In future, it will be treated as `np.float64 == np.dtype(float).type`.
not np.issubdtype(value.dtype, np.float) and \
Traceback (most recent call last):
  [call chain identical to the previous OOM failure]
tensorflow.python.framework.errors_impl.ResourceExhaustedError: OOM when allocating tensor with shape[2,355,12789] and type float on /job:localhost/replica:0/task:0/device:CPU:0 by allocator cpu
  [[Node: inference/sample/DropoutNormal_1/log_prob/stack = Pack[N=2, T=DT_FLOAT, axis=0, _device="/job:localhost/replica:0/task:0/device:CPU:0"](inference/sample/DropoutNormal_1/log_prob/LogSigmoid_1, inference/sample/DropoutNormal_1/log_prob/add_2)]]
Hint: If you want to see a list of allocated tensors when OOM happens, add report_tensor_allocations_upon_oom to RunOptions for current allocation info.
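Both failed runs die inside `DropoutNormal._log_prob`, where `tf.stack` materialises a [2, cells, genes] tensor, so shrinking the gene dimension shrinks the allocation directly. The sketch below is a hypothetical mitigation, not part of ouijaflow: `subsample_features` and the variance criterion are illustrative, and only the cells-by-genes layout follows the traceback's `oui.fit(expression.values, ...)` call.

```python
import numpy as np

def subsample_features(expr, max_features=2000):
    """Keep the `max_features` highest-variance columns of a cells x genes matrix."""
    if expr.shape[1] <= max_features:
        return expr
    keep = np.argsort(expr.var(axis=0))[-max_features:]
    return expr[:, np.sort(keep)]  # preserve original column order

# Matrix sized like the second failing dataset: 355 cells x 12789 genes.
expr = np.random.default_rng(0).random((355, 12789))
small = subsample_features(expr)
print(small.shape)  # (355, 2000)
# The run script would then fit on `small` instead of the full matrix.
```

Whether 2000 genes is appropriate depends on the dataset; the point is only that the OOM grows linearly in the gene count.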
Input saved to /scratch/irc/personal/robrechtc/tmp//RtmpaIKkMM/input:
expression.csv
features_id.json
params.json
1/1000 [ 0%] ETA: 2492s | Loss: 201380.781
10/1000 [ 1%] ETA: 249s | Loss: 109446.969
[... per-10-iteration progress lines elided; every 100th iteration kept ...]
100/1000 [ 10%] ███ ETA: 25s | Loss: 32519.234
200/1000 [ 20%] ██████ ETA: 12s | Loss: 26884.270
300/1000 [ 30%] █████████ ETA: 8s | Loss: 25868.742
400/1000 [ 40%] ████████████ ETA: 5s | Loss: 25132.889
500/1000 [ 50%] ███████████████ ETA: 4s | Loss: 22083.994
600/1000 [ 60%] ██████████████████ ETA: 3s | Loss: 18869.977
700/1000 [ 70%] █████████████████████ ETA: 2s | Loss: 16993.484
800/1000 [ 80%] ████████████████████████ ETA: 1s | Loss: 15706.084
900/1000 [ 90%] ███████████████████████████ ETA: 0s | Loss: 15299.218
1000/1000 [100%] ██████████████████████████████ Elapsed: 6s | Loss: 14922.646
/usr/local/lib/python3.6/site-packages/edward/util/random_variables.py:52: FutureWarning: Conversion of the second argument of issubdtype from `float` to `np.floating` is deprecated. In future, it will be treated as `np.float64 == np.dtype(float).type`.
  not np.issubdtype(value.dtype, np.float) and \
output saved in /scratch/irc/personal/robrechtc/tmp//RtmpaIKkMM/output:
pseudotime.csv
timings.json