
fix understand_sentiment_lstm.py #16

Closed
wants to merge 4 commits

Conversation

@chengduoZH (Collaborator) commented Dec 12, 2017

  • When the last batch of the dataset contains fewer samples than batch_size, running "understand_sentiment_lstm.py" fails:
Traceback (most recent call last):
  File "/usr/lib/python2.7/pdb.py", line 1314, in main
    pdb._runscript(mainpyfile)
  File "/usr/lib/python2.7/pdb.py", line 1233, in _runscript
    self.run(statement)
  File "/usr/lib/python2.7/bdb.py", line 400, in run
    exec cmd in globals, locals
  File "<string>", line 1, in <module>
  File "understand_sentiment_lstm.py", line 1, in <module>
    from __future__ import absolute_import
  File "understand_sentiment_lstm.py", line 163, in run_benchmark
    fetch_list=[avg_cost] + accuracy.metrics)
  File "/paddle/Paddle_release/Paddle/python/paddle/v2/fluid/executor.py", line 144, in run
    self.executor.run(program.desc, scope, 0, True)
EnforceNotMet: enforce capacity == in_size failed, 81920 != 20480
The size of Input(X) mismatches with Attr(shape). at [/paddle/Paddle_release/Paddle/paddle/operators/reshape_op.cc:51]
PaddlePaddle Call Stacks:
0       0x7f38a3bd9777p paddle::platform::EnforceNotMet::EnforceNotMet(std::__exception_ptr::exception_ptr, char const*, int) + 727
1       0x7f38a3e4500fp paddle::operators::ReshapeOp::InferShape(paddle::framework::InferShapeContext*) const + 2239
2       0x7f38a43e7d2dp paddle::framework::OperatorWithKernel::Run(paddle::framework::Scope const&, paddle::platform::DeviceContext const&) const + 573
3       0x7f38a3c83b45p paddle::framework::Executor::Run(paddle::framework::ProgramDescBind const&, paddle::framework::Scope*, int, bool) + 613
4       0x7f38a3beff8ap void pybind11::cpp_function::initialize<pybind11::cpp_function::initialize<void, paddle::framework::Executor, paddle::framework::ProgramDescBind const&, paddle::framework::Scope*, int, bool, pybind11::name, pybind11::is_method, pybind11::sibling>(void (paddle::framework::Executor::*)(paddle::framework::ProgramDescBind const&, paddle::framework::Scope*, int, bool), pybind11::name const&, pybind11::is_method const&, pybind11::sibling const&)::{lambda(paddle::framework::Executor*, paddle::framework::ProgramDescBind const&, paddle::framework::Scope*, int, bool)#1}, void, paddle::framework::Executor*, paddle::framework::ProgramDescBind const&, paddle::framework::Scope*, int, bool, pybind11::name, pybind11::is_method, pybind11::sibling>(pybind11::cpp_function::initialize<void, paddle::framework::Executor, paddle::framework::ProgramDescBind const&, paddle::framework::Scope*, int, bool, pybind11::name, pybind11::is_method, pybind11::sibling>(void (paddle::framework::Executor::*)(paddle::framework::ProgramDescBind const&, paddle::framework::Scope*, int, bool), pybind11::name const&, pybind11::is_method const&, pybind11::sibling const&)::{lambda(paddle::framework::Executor*, paddle::framework::ProgramDescBind const&, paddle::framework::Scope*, int, bool)#1}&&, void (*)(paddle::framework::Executor*, paddle::framework::ProgramDescBind const&, paddle::framework::Scope*, int, bool), pybind11::name const&, pybind11::is_method const&, pybind11::sibling const&)::{lambda(pybind11::detail::function_call&)#3}::_FUN(pybind11::detail::function_call) + 490
5       0x7f38a3bedcf4p pybind11::cpp_function::dispatcher(_object*, _object*, _object*) + 1236
  • In this PR, when the size of the last batch is less than batch_size, the program skips that batch:
if len(data) < args.batch_size: continue
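The effect of this guard can be illustrated with a minimal, self-contained sketch. The `batch_reader` and `full_batches` helpers below are hypothetical stand-ins for the script's data reader, not PaddlePaddle APIs; they only show why dropping the short final batch avoids the size mismatch seen in the traceback:

```python
# Hypothetical sketch: dropping the final short batch so every batch
# fed to the model has exactly batch_size samples.
def batch_reader(samples, batch_size):
    """Yield successive batches; the last one may be shorter."""
    for i in range(0, len(samples), batch_size):
        yield samples[i:i + batch_size]

def full_batches(samples, batch_size):
    """Keep only full batches, mirroring the PR's `continue` guard."""
    return [b for b in batch_reader(samples, batch_size)
            if len(b) >= batch_size]

# 100 samples with batch_size=32 leave a trailing batch of 4,
# which is dropped, so only 3 full batches remain.
batches = full_batches(list(range(100)), 32)
```

With every surviving batch guaranteed to hold `batch_size` samples, the fixed `shape` attribute of `reshape_op` (e.g. capacity 81920) always matches the input size.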
  • When the last batch of the dataset contains fewer samples than batch_size, running "understand_sentiment_dynamic_lstm.py" also fails:
Traceback (most recent call last):
  File "/usr/lib/python2.7/runpy.py", line 174, in _run_module_as_main
    "__main__", fname, loader, pkg_name)
  File "/usr/lib/python2.7/runpy.py", line 72, in _run_code
    exec code in run_globals
  File "/usr/local/lib/python2.7/dist-packages/yep.py", line 174, in <module>
    main()
  File "/usr/local/lib/python2.7/dist-packages/yep.py", line 148, in main
    __main__.__dict__)
  File "/paddle/benchmark_profiling/understand_sentiment_dynamic_lstm.py", line 158, in <module>
    run_benchmark(dynamic_lstm_model, args)
  File "/paddle/benchmark_profiling/understand_sentiment_dynamic_lstm.py", line 131, in run_benchmark
    label = label.reshape([args.batch_size, 1])
ValueError: cannot reshape array of size 8 into shape (32,1)
  • In this PR, when the size of the last batch is less than batch_size, the program reshapes label to [len(data), 1] instead:
label = label.reshape([len(data), 1])
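A minimal NumPy sketch (the variable names mirror the script, but the values are illustrative) shows both the failure mode from the traceback and the fix:

```python
import numpy as np

batch_size = 32
data = list(range(8))          # a short final batch of 8 samples
label = np.arange(len(data))   # 8 labels

# label.reshape([batch_size, 1]) would raise
# "ValueError: cannot reshape array of size 8 into shape (32,1)",
# because 8 elements cannot fill a (32, 1) array.
# Reshaping to the actual batch length always succeeds:
label = label.reshape([len(data), 1])
```

Using `len(data)` ties the target shape to however many samples the batch actually contains, so both full and short batches work.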

for data in train_reader():
    chopped_data = chop_data(
        data, chop_len=args.seq_len, batch_size=args.batch_size)
    if len(data) < args.batch_size:
        continue
Collaborator:
File "/paddle/Paddle_release/Paddle/python/paddle/v2/fluid/executor.py", line 144, in run
self.executor.run(program.desc, scope, 0, True)
EnforceNotMet: enforce capacity == in_size failed, 81920 != 20480

Maybe this is a bug in the operator. I do not think the batch needs to be skipped here.

Collaborator (Author):

I have notified the relevant people: PaddlePaddle/Paddle#6533

@chengduoZH chengduoZH force-pushed the fix/lstm_data_shape branch 3 times, most recently from 11604aa to ba95603 Compare January 8, 2018 07:30
@chengduoZH
Copy link
Collaborator Author

Currently, when running understand_sentiment_lstm.py, if the last batch of the dataset contains fewer samples than the specified BATCH_SIZE, the program still crashes inside reshape_op. A follow-up change will make the shape attribute of reshape_op an input; once that lands, this problem will no longer occur.

@chengduoZH chengduoZH force-pushed the fix/lstm_data_shape branch from ba95603 to 5d79960 Compare January 8, 2018 07:57
@chengduoZH chengduoZH closed this Jan 8, 2018
@chengduoZH chengduoZH reopened this Jan 8, 2018
@chengduoZH chengduoZH closed this Jan 8, 2018