Add 4 non-fault-tolerant demos #307

Merged · 1 commit merged into PaddlePaddle:develop on Aug 17, 2017
Conversation

helinwang (Collaborator) commented on Aug 14, 2017

Related: #147

helinwang requested a review from Yancey1989 on Aug 14, 2017 20:30

The diff under review adds a local recordio reader as a temporary workaround:

files_current_train = []
# TODO(helin): remove this once paddle.v2.reader.creator.recordio is
# fixed.
def recordio(paths, buf_size=100):
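
For context, a reader creator of this shape usually expands the glob pattern and yields records one file at a time; the PR's actual function body is not shown here. A minimal sketch of that pattern, where read_records is a hypothetical per-file record iterator:

import glob

def recordio(paths, buf_size=100):
    # buf_size mirrors the signature of paddle.v2.reader.creator.recordio;
    # a real implementation could use it to size a shuffle buffer.
    def reader():
        # Sort so every run iterates the files in the same order.
        for fn in sorted(glob.glob(paths)):
            # read_records is a hypothetical helper that decodes and
            # yields the records stored in a single recordio file.
            for record in read_records(fn):
                yield record
    return reader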

Collaborator commented:

With this reader implementation, every trainer will fetch the same training data. Shouldn't each trainer instead fetch only a part of the whole training data, or have I missed something?

Collaborator commented:

In non-fault-tolerant mode, each reader must fetch its own part of the training data by itself.

helinwang (Collaborator, Author) commented on Aug 16, 2017:

Thanks! Sharding does not work correctly until #319 and #318 are fixed. Can I merge this first and submit a follow-up PR for sharding after those issues are resolved?

The planned code is:

import glob
import os

def get_shard(paths):
    # Shard the input files across trainers round-robin so the shards
    # are disjoint and together cover every file.
    trainer_count = int(os.getenv("PADDLE_INIT_NUM_GRADIENT_SERVERS"))
    # Assumes the trainer index is exposed as PADDLE_INIT_TRAINER_ID.
    trainer_id = int(os.getenv("PADDLE_INIT_TRAINER_ID"))
    files = glob.glob(paths)
    files.sort()  # sort so every trainer sees the same file order
    files_current_train = []
    for idx, fn in enumerate(files):
        if idx % trainer_count == trainer_id:
            files_current_train.append(fn)
    return files_current_train
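
For example, with trainer_count = 2 and five sorted files, trainer 0 gets the files at indices 0, 2, and 4 while trainer 1 gets indices 1 and 3, so each file is read by exactly one trainer.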

Yancey1989 (Collaborator) left a review comment:

LGTM!

helinwang merged commit 553a02e into PaddlePaddle:develop on Aug 17, 2017.
helinwang deleted the demo branch on Aug 17, 2017 04:26.