
Removes confusing zero mask from VilBERT #5264

Merged
merged 5 commits into main from ZeroMask on Jun 17, 2021

Conversation

dirkgr (Member) commented Jun 15, 2021

I found this while trawling through @jacob-morrison's code. The mask is never used later, and if it were used, it would be wrong.
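The removed code is not quoted in this thread, so the following is only a rough illustration of the pattern being deleted: an all-zeros tensor built on the image side of the backbone that nothing downstream consumes. The names (`embed_image_regions`, `box_features`, `box_mask`) are placeholders, not the actual VilBertBackbone fields.

```python
import torch

def embed_image_regions(box_features: torch.Tensor, box_mask: torch.Tensor) -> torch.Tensor:
    """Toy stand-in for the image-side embedding step of a VilBERT-style backbone."""
    # The kind of line this PR removes: a zero mask that nothing downstream reads.
    # zero_mask = torch.zeros_like(box_mask)
    #
    # Had it been passed to the encoder, an all-zeros mask (0 = ignore under the
    # usual convention) would have hidden every image region, so deleting the dead
    # line is safer than wiring it through.
    return box_features * box_mask.unsqueeze(-1).float()
```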

@dirkgr dirkgr requested review from epwalsh and jacob-morrison June 15, 2021 17:59
@dirkgr dirkgr self-assigned this Jun 15, 2021
jacob-morrison (Contributor) left a comment

I think we'll have problems in the bimodal encoder if we don't pass in the mask: /~https://github.com/allenai/allennlp/blob/main/allennlp/modules/transformer/bimodal_encoder.py#L194-L199. I think the problem is actually in the encoder, though (i.e., we should still delete this useless mask in the backbone), because co_attention_mask's default value is None, and I'm pretty sure these lines will break if it is None.
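The linked encoder lines aren't quoted here; the failure mode described is tensor operations running on `co_attention_mask` while its default is `None`. Below is a minimal sketch, assuming hypothetical names and shapes rather than the actual `BiModalEncoder` code, of the kind of guard that avoids the crash when the caller leaves the default in place.

```python
from typing import Optional

import torch

def cross_modal_layer(
    embedding1: torch.Tensor,  # (batch, len1, dim), e.g. text
    embedding2: torch.Tensor,  # (batch, len2, dim), e.g. image regions
    co_attention_mask: Optional[torch.Tensor] = None,
):
    if co_attention_mask is None:
        # Treat "no mask" as "no cross-modal positions are blocked" instead of
        # failing on a None default.
        co_attention_mask = torch.zeros(
            embedding1.size(0), embedding2.size(1), embedding1.size(1),
            device=embedding1.device,
        )
    # ... cross-attention between the two modalities would consume the mask here ...
    return embedding1, embedding2, co_attention_mask
```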

dirkgr (Member, Author) commented Jun 16, 2021

I think that fixes it, but let's see what the tests say.

jacob-morrison (Contributor) left a comment

LGTM!

@dirkgr dirkgr enabled auto-merge (squash) June 17, 2021 17:18
@dirkgr dirkgr merged commit af101d6 into main Jun 17, 2021
@dirkgr dirkgr deleted the ZeroMask branch June 17, 2021 17:32