Commit History

Removed unnecessary parameter
8172944

PeteBleackley committed on

get_input_embeddings() directly from base model
e095479

PeteBleackley committed on

Missing 'from_pretrained'
215b416

PeteBleackley committed on

config didn't need to be a property
0abed2a

PeteBleackley committed on

There's a simpler way of doing this, I hope
858f75e

PeteBleackley committed on

Might be simpler to inherit from RobertaModel rather than PreTrainedModel
f0ad7f1

PeteBleackley committed on

Removed a base model that was causing a loop in model initialisation
87535ff

PeteBleackley committed on

Problems with config
2f6dc26

PeteBleackley committed on

Fixed typo
1d631bc

PeteBleackley committed on

Removed line that would have failed
dbfe7ff

PeteBleackley committed on

Fixed import
acda749

PeteBleackley committed on

Typo
ed62a1c

PeteBleackley committed on

Further changes for compatibility with the HuggingFace PyTorch implementation
5b7a8ed

PeteBleackley committed on

Minor error in setting up tokenizer
fb4c0b0

PeteBleackley committed on

PyTorch implementation of HuggingFace PreTrainedModel class does not allow direct setting of base_model. Rejig constructors accordingly
519dfd1

PeteBleackley committed on

Removed superfluous ()
4cda7b6

PeteBleackley committed on

Removed superfluous ()
518e821

PeteBleackley committed on

Corrected inheritance
8823ce8

PeteBleackley committed on

Updated requirements.txt
bacc9ad

PeteBleackley committed on

Modified CombinedCorpus to use PyTorch
7a9be99

PeteBleackley committed on

Modified training scripts to use PyTorch
c8625dc

PeteBleackley committed on

Converted QaracTrainerModel to use PyTorch
56e5680

PeteBleackley committed on

Converted QaracDecoderModel to use PyTorch
13f1508

PeteBleackley committed on

Converted QaracEncoderModel to use PyTorch
37a581e

PeteBleackley committed on

Converted GlobalAttentionPoolingHead to use PyTorch
32df2f1

PeteBleackley committed on

items, not values
ac98be7

PeteBleackley committed on

Removed diagnostics
8561f47

PeteBleackley committed on

Removed unnecessary files
465b3db

PeteBleackley committed on

Use dictionary comprehensions to do padding and return dictionaries instead of default dictionaries
7a61dc8

PeteBleackley committed on

Implement max_lengths for CorpusRepeater
2d04d62

PeteBleackley committed on

Forgot a len
be7beac

PeteBleackley committed on

TPUs need constant batch shapes
fcfc2b3

PeteBleackley committed on

Completed script for testing consistency
dd9c3ed

PeteBleackley committed on

Trainable => trainable
a8c528d

PeteBleackley committed on

Ensure weights are trainable
e556cb6

PeteBleackley committed on

Fixed name of argument
1b76f7d

PeteBleackley committed on

Tensor is logits
888010e

PeteBleackley committed on

Removed extraneous self
ae31ae3

PeteBleackley committed on

The other layer returned a tuple as well
095f432

PeteBleackley committed on

Low level RoBERTa layers don't necessarily return what I expect them to
0941a89

PeteBleackley committed on

Fixed typo
50de02e

PeteBleackley committed on

Needed more arguments
58d8758

PeteBleackley committed on

Arguments to Concatenate layer should be in a list
30efe84

PeteBleackley committed on

Fixed arguments to decoder head
7b59e3d

PeteBleackley committed on

Incomplete testing script for consistency. Fixed typo
14f1a57

PeteBleackley committed on

Testing script for reasoning
6d6bb62

PeteBleackley committed on

Testing scripts
65ae142

PeteBleackley committed on

Attention masks, generation, and testing script
6ebe943

PeteBleackley committed on

Attention masks are only necessary for inputs
2802fd8

PeteBleackley committed on

Making sure RoBERTa layers have all required arguments
b2593fa

PeteBleackley committed on