Implement function for BERT quantization tutorial, resolves issue #1971 (#2403)

Co-authored-by: Carl Parker <carljparker@meta.com>
JoseLuisC99 and carljparker authored Jun 2, 2023
1 parent fd9a6a7 commit b966c1f
Showing 1 changed file with 18 additions and 0 deletions.
18 changes: 18 additions & 0 deletions intermediate_source/dynamic_quantization_bert_tutorial.rst
@@ -255,6 +255,9 @@ model before and after the dynamic quantization.
torch.manual_seed(seed)
set_seed(42)

# Initialize a global random number generator
global_rng = random.Random()
2.2 Load the fine-tuned BERT model
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
@@ -525,6 +528,21 @@ We can serialize and save the quantized model for future use using

.. code:: python
def ids_tensor(shape, vocab_size, rng=None, name=None):
    # Creates a random torch.long tensor of the given shape,
    # with values drawn uniformly from [0, vocab_size)
    if rng is None:
        rng = global_rng

    total_dims = 1
    for dim in shape:
        total_dims *= dim

    values = []
    for _ in range(total_dims):
        values.append(rng.randint(0, vocab_size - 1))

    return torch.tensor(data=values, dtype=torch.long, device='cpu').view(shape).contiguous()
input_ids = ids_tensor([8, 128], 2)
token_type_ids = ids_tensor([8, 128], 2)
attention_mask = ids_tensor([8, 128], vocab_size=2)
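
For context, a minimal sketch of how these dummy batches could serve as a smoke test for the models the tutorial builds. The names model and quantized_model are assumed here to be the FP32 and dynamically quantized BertForSequenceClassification instances defined earlier in the tutorial; this snippet is illustrative and not part of the commit.

# Illustrative smoke test (assumed names, not part of this diff): run both
# models on the dummy batch and confirm the logits agree in shape.
with torch.no_grad():
    fp32_logits = model(input_ids,
                        attention_mask=attention_mask,
                        token_type_ids=token_type_ids)[0]
    int8_logits = quantized_model(input_ids,
                                  attention_mask=attention_mask,
                                  token_type_ids=token_type_ids)[0]
# Batch of 8; MRPC is a two-label task, so we expect (8, 2) logits.
assert fp32_logits.shape == int8_logits.shape == (8, 2)

Because ids_tensor falls back to the module-level global_rng, seeding it up front (or passing an explicitly seeded generator, e.g. rng=random.Random(42)) makes these dummy batches reproducible across runs.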