Spaces: usernameisanna / BLIP2-DeiT-VQA
BLIP2-DeiT-VQA · 2 contributors · History: 7 commits
Latest commit: usernameisanna · "Upload full-blip2-deit.pth" · 51a65cd (verified) · 6 months ago
.gitattributes · Safe · 1.52 kB · "Duplicate from gradio-templates/chatbot" · 6 months ago
README.md · Safe · 344 Bytes · "initial commit" · 6 months ago
app.py · Safe · 2.62 kB · "update path" · 6 months ago
full-blip2-deit-config-2.pth · pickle · 1.47 GB · LFS · "Upload full-blip2-deit-config-2.pth" · 6 months ago
Detected Pickle imports (50): "torch._C._nn.gelu", "transformers.models.deit.modeling_deit.DeiTPatchEmbeddings",
"transformers.models.t5.configuration_t5.T5Config", "transformers.models.t5.modeling_t5.T5Attention",
"transformers.models.deit.configuration_deit.DeiTConfig", "transformers.models.t5.modeling_t5.T5LayerCrossAttention",
"transformers.models.t5.modeling_t5.T5Block", "transformers.models.t5.modeling_t5.T5LayerFF",
"torch.nn.modules.activation.ReLU", "torch.nn.modules.conv.Conv2d", "torch._utils._rebuild_tensor_v2",
"torch.nn.modules.activation.Tanh", "transformers.models.deit.modeling_deit.DeiTSelfOutput",
"transformers.models.t5.modeling_t5.T5LayerNorm", "torch._utils._rebuild_parameter",
"transformers.models.deit.modeling_deit.DeiTOutput", "transformers.generation.configuration_utils.GenerationConfig",
"transformers.models.blip_2.modeling_blip_2.Blip2QFormerIntermediate", "torch.nn.modules.dropout.Dropout",
"transformers.models.blip_2.modeling_blip_2.Blip2QFormerModel",
"transformers.models.blip_2.modeling_blip_2.Blip2QFormerAttention",
"transformers.models.t5.modeling_t5.T5LayerSelfAttention",
"transformers.models.blip_2.configuration_blip_2.Blip2QFormerConfig",
"transformers.models.blip_2.modeling_blip_2.Blip2QFormerSelfOutput", "transformers.models.t5.modeling_t5.T5Stack",
"torch.nn.modules.normalization.LayerNorm", "torch.nn.modules.container.ModuleList",
"transformers.activations.GELUActivation", "transformers.models.deit.modeling_deit.DeiTModel",
"transformers.models.deit.modeling_deit.DeiTAttention", "transformers.models.t5.modeling_t5.T5DenseActDense",
"transformers.models.blip_2.modeling_blip_2.Blip2QFormerOutput", "transformers.models.deit.modeling_deit.DeiTEncoder",
"torch.float32", "transformers.models.blip_2.configuration_blip_2.Blip2Config",
"transformers.models.deit.modeling_deit.DeiTLayer",
"transformers.models.blip_2.modeling_blip_2.Blip2QFormerMultiHeadAttention",
"transformers.models.blip_2.modeling_blip_2.Blip2ForConditionalGeneration", "torch.FloatStorage", "__builtin__.set",
"transformers.models.t5.modeling_t5.T5ForConditionalGeneration",
"transformers.models.deit.modeling_deit.DeiTIntermediate", "transformers.models.deit.modeling_deit.DeiTPooler",
"transformers.models.blip_2.modeling_blip_2.Blip2QFormerEncoder",
"transformers.models.blip_2.modeling_blip_2.Blip2QFormerLayer", "torch.nn.modules.sparse.Embedding",
"collections.OrderedDict", "transformers.models.deit.modeling_deit.DeiTEmbeddings",
"transformers.models.deit.modeling_deit.DeiTSelfAttention", "torch.nn.modules.linear.Linear"
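The "Detected Pickle imports" entries above are presumably produced by a static scan of the pickle bytestream for class-reference opcodes, which lets a scanner enumerate every `module.name` a checkpoint would import without ever unpickling (and therefore without executing) the payload. A minimal sketch of that kind of scan, using only Python's stdlib `pickletools` (Hugging Face's actual scanner is separate tooling, so treat this as an approximation):

```python
import collections
import pickle
import pickletools


def detect_pickle_imports(data: bytes) -> set[str]:
    """List "module.name" references in a pickle stream without unpickling it.

    The GLOBAL (and legacy INST) opcode carries "module name" as a single
    string argument; STACK_GLOBAL (protocol 4+) instead consumes the two
    most recently pushed strings as module and qualified name.
    """
    imports: set[str] = set()
    recent_strings: list[str] = []  # string opcode arguments seen so far
    for opcode, arg, _pos in pickletools.genops(data):
        if opcode.name in ("GLOBAL", "INST"):
            module, _, name = arg.partition(" ")
            imports.add(f"{module}.{name}")
        elif opcode.name == "STACK_GLOBAL" and len(recent_strings) >= 2:
            imports.add(f"{recent_strings[-2]}.{recent_strings[-1]}")
        if isinstance(arg, str):
            recent_strings.append(arg)
    return imports


# An OrderedDict pickles via a global reference to its class, so the scan
# reports it -- just like the "collections.OrderedDict" entry above:
payload = pickle.dumps(collections.OrderedDict(a=1), protocol=4)
print(detect_pickle_imports(payload))  # → {'collections.OrderedDict'}
```

Tracking "most recent strings" rather than the immediately preceding opcodes matters because protocol 4 interleaves MEMOIZE opcodes between the module and name pushes.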
full-blip2-deit.pth · pickle · 1.51 GB · LFS · "Upload full-blip2-deit.pth" · 6 months ago
Detected Pickle imports (47): "transformers.models.opt.modeling_opt.OPTModel",
"transformers.models.blip_2.modeling_blip_2.Blip2QFormerOutput",
"transformers.models.deit.modeling_deit.DeiTEmbeddings", "torch.nn.modules.sparse.Embedding",
"transformers.models.opt.modeling_opt.OPTAttention", "transformers.models.deit.modeling_deit.DeiTEncoder",
"transformers.models.deit.modeling_deit.DeiTSelfOutput",
"transformers.models.opt.modeling_opt.OPTLearnedPositionalEmbedding",
"transformers.models.deit.modeling_deit.DeiTPatchEmbeddings",
"transformers.models.deit.modeling_deit.DeiTSelfAttention", "torch.nn.modules.container.ModuleList",
"transformers.models.blip_2.configuration_blip_2.Blip2QFormerConfig", "torch.nn.modules.normalization.LayerNorm",
"transformers.models.opt.modeling_opt.OPTDecoderLayer",
"transformers.models.blip_2.modeling_blip_2.Blip2QFormerModel", "torch.nn.modules.activation.ReLU",
"transformers.generation.configuration_utils.GenerationConfig", "torch._C._nn.gelu",
"transformers.models.blip_2.modeling_blip_2.Blip2QFormerMultiHeadAttention",
"transformers.models.blip_2.modeling_blip_2.Blip2QFormerEncoder", "torch.FloatStorage",
"transformers.activations.GELUActivation", "transformers.models.deit.modeling_deit.DeiTPooler",
"transformers.models.opt.modeling_opt.OPTForCausalLM", "torch.nn.modules.activation.Tanh",
"transformers.models.deit.configuration_deit.DeiTConfig",
"transformers.models.blip_2.modeling_blip_2.Blip2QFormerSelfOutput", "torch._utils._rebuild_parameter",
"transformers.models.blip_2.configuration_blip_2.Blip2Config", "torch.float32",
"transformers.models.deit.modeling_deit.DeiTIntermediate", "transformers.models.deit.modeling_deit.DeiTOutput",
"collections.OrderedDict", "transformers.models.deit.modeling_deit.DeiTModel",
"transformers.models.deit.modeling_deit.DeiTAttention",
"transformers.models.blip_2.modeling_blip_2.Blip2QFormerAttention", "torch.nn.modules.dropout.Dropout",
"torch._utils._rebuild_tensor_v2", "transformers.models.blip_2.modeling_blip_2.Blip2ForConditionalGeneration",
"transformers.models.blip_2.modeling_blip_2.Blip2QFormerLayer", "transformers.models.deit.modeling_deit.DeiTLayer",
"torch.nn.modules.linear.Linear", "torch.nn.modules.conv.Conv2d",
"transformers.models.opt.configuration_opt.OPTConfig",
"transformers.models.blip_2.modeling_blip_2.Blip2QFormerIntermediate",
"transformers.models.opt.modeling_opt.OPTDecoder", "__builtin__.set"
requirements.txt · Safe · 59 Bytes · "update requirements.txt" · 6 months ago