Detected Pickle imports (237)
- "__torch__.multimodal.model.multimodal_transformer.___torch_mangle_9466.ResidualAttentionBlock",
- "__torch__.torch.nn.modules.activation.___torch_mangle_9516.MultiheadAttention",
- "__torch__.torch.nn.modules.container.___torch_mangle_9584.Sequential",
- "__torch__.multimodal.model.multimodal_transformer.___torch_mangle_9417.QuickGELU",
- "__torch__.multimodal.model.multimodal_transformer.___torch_mangle_9415.LayerNorm",
- "__torch__.torch.nn.modules.linear.___torch_mangle_9386._LinearWithBias",
- "__torch__.multimodal.model.multimodal_transformer.___torch_mangle_9477.Transformer",
- "__torch__.torch.nn.modules.linear.___torch_mangle_9436.Linear",
- "__torch__.torch.nn.modules.container.___torch_mangle_9464.Sequential",
- "__torch__.torch.nn.modules.activation.___torch_mangle_9534.MultiheadAttention",
- "__torch__.multimodal.model.multimodal_transformer.___torch_mangle_9465.LayerNorm",
- "__torch__.multimodal.model.multimodal_transformer.___torch_mangle_9453.QuickGELU",
- "__torch__.torch.nn.modules.linear.___torch_mangle_9431._LinearWithBias",
- "__torch__.torch.nn.modules.conv.___torch_mangle_9366.Conv2d",
- "__torch__.multimodal.model.multimodal_transformer.___torch_mangle_9403.ResidualAttentionBlock",
- "__torch__.torch.nn.modules.container.___torch_mangle_9419.Sequential",
- "__torch__.multimodal.model.multimodal_transformer.___torch_mangle_9537.QuickGELU",
- "__torch__.multimodal.model.multimodal_transformer.___torch_mangle_9499.LayerNorm",
- "__torch__.torch.nn.modules.activation.___torch_mangle_9570.MultiheadAttention",
- "__torch__.multimodal.model.multimodal_transformer.___torch_mangle_9448.ResidualAttentionBlock",
- "__torch__.multimodal.model.multimodal_transformer.___torch_mangle_9444.QuickGELU",
- "__torch__.torch.nn.modules.container.___torch_mangle_9374.Sequential",
- "__torch__.multimodal.model.multimodal_transformer.___torch_mangle_9475.ResidualAttentionBlock",
- "__torch__.torch.nn.modules.linear.___torch_mangle_9581.Linear",
- "__torch__.torch.nn.modules.linear.___torch_mangle_9377._LinearWithBias",
- "__torch__.torch.nn.modules.container.___torch_mangle_9521.Sequential",
- "__torch__.torch.nn.modules.activation.___torch_mangle_9378.MultiheadAttention",
- "__torch__.multimodal.model.multimodal_transformer.___torch_mangle_9588.Transformer",
- "__torch__.multimodal.model.multimodal_transformer.___torch_mangle_9379.LayerNorm",
- "__torch__.torch.nn.modules.linear.___torch_mangle_9569._LinearWithBias",
- "__torch__.torch.nn.modules.linear.___torch_mangle_9560._LinearWithBias",
- "__torch__.multimodal.model.multimodal_transformer.___torch_mangle_9421.ResidualAttentionBlock",
- "__torch__.multimodal.model.multimodal_transformer.___torch_mangle_9495.LayerNorm",
- "__torch__.multimodal.model.multimodal_transformer.___torch_mangle_9586.ResidualAttentionBlock",
- "__torch__.multimodal.model.multimodal_transformer.___torch_mangle_9394.ResidualAttentionBlock",
- "__torch__.torch.nn.modules.container.___torch_mangle_9392.Sequential",
- "__torch__.multimodal.model.multimodal_transformer.___torch_mangle_9571.LayerNorm",
- "__torch__.multimodal.model.multimodal_transformer.___torch_mangle_9390.QuickGELU",
- "__torch__.torch.nn.modules.linear.___torch_mangle_9400.Linear",
- "__torch__.multimodal.model.multimodal_transformer.___torch_mangle_9522.LayerNorm",
- "__torch__.multimodal.model.multimodal_transformer.___torch_mangle_9555.QuickGELU",
- "__torch__.multimodal.model.multimodal_transformer.___torch_mangle_9505.ResidualAttentionBlock",
- "__torch__.torch.nn.modules.linear.___torch_mangle_9443.Linear",
- "__torch__.torch.nn.modules.linear.___torch_mangle_9542._LinearWithBias",
- "__torch__.torch.nn.modules.linear.___torch_mangle_9506._LinearWithBias",
- "__torch__.torch.nn.modules.activation.___torch_mangle_9507.MultiheadAttention",
- "__torch__.torch.nn.modules.container.___torch_mangle_9401.Sequential",
- "__torch__.torch.nn.modules.linear.___torch_mangle_9398.Linear",
- "__torch__.torch.nn.modules.activation.___torch_mangle_9450.MultiheadAttention",
- "__torch__.multimodal.model.multimodal_transformer.___torch_mangle_9447.LayerNorm",
- "__torch__.torch.nn.modules.linear.___torch_mangle_9470.Linear",
- "__torch__.torch.nn.modules.container.___torch_mangle_9446.Sequential",
- "__torch__.torch.nn.modules.linear.___torch_mangle_9382.Linear",
- "__torch__.multimodal.model.multimodal_transformer.___torch_mangle_9481.LayerNorm",
- "__torch__.multimodal.model.multimodal_transformer.___torch_mangle_9576.LayerNorm",
- "__torch__.torch.nn.modules.container.___torch_mangle_9428.Sequential",
- "__torch__.torch.nn.modules.sparse.___torch_mangle_9589.Embedding",
- "__torch__.torch.nn.modules.linear.___torch_mangle_9472.Linear",
- "__torch__.torch.nn.modules.linear.___torch_mangle_9545.Linear",
- "__torch__.torch.nn.modules.activation.___torch_mangle_9480.MultiheadAttention",
- "__torch__.torch.nn.modules.activation.___torch_mangle_9369.MultiheadAttention",
- "__torch__.torch.nn.modules.linear.___torch_mangle_9533._LinearWithBias",
- "__torch__.torch.nn.modules.linear.___torch_mangle_9484.Linear",
- "__torch__.torch.nn.modules.linear.___torch_mangle_9574.Linear",
- "__torch__.multimodal.model.multimodal_transformer.___torch_mangle_9384.LayerNorm",
- "__torch__.multimodal.model.multimodal_transformer.___torch_mangle_9541.ResidualAttentionBlock",
- "__torch__.torch.nn.modules.linear.___torch_mangle_9547.Linear",
- "__torch__.torch.nn.modules.linear.___torch_mangle_9452.Linear",
- "__torch__.multimodal.model.multimodal_transformer.___torch_mangle_9367.LayerNorm",
- "__torch__.multimodal.model.multimodal_transformer.___torch_mangle_9567.LayerNorm",
- "__torch__.multimodal.model.multimodal_transformer.___torch_mangle_9408.QuickGELU",
- "__torch__.multimodal.model.multimodal_transformer.___torch_mangle_9540.LayerNorm",
- "__torch__.multimodal.model.multimodal_transformer.___torch_mangle_9442.LayerNorm",
- "__torch__.multimodal.model.multimodal_transformer.___torch_mangle_9573.QuickGELU",
- "__torch__.multimodal.model.multimodal_transformer.___torch_mangle_9559.ResidualAttentionBlock",
- "__torch__.torch.nn.modules.activation.___torch_mangle_9387.MultiheadAttention",
- "__torch__.torch.nn.modules.linear.___torch_mangle_9551._LinearWithBias",
- "__torch__.multimodal.model.multimodal_transformer.___torch_mangle_9546.QuickGELU",
- "__torch__.torch.nn.modules.container.___torch_mangle_9512.Sequential",
- "__torch__.multimodal.model.multimodal_transformer.___torch_mangle_9523.ResidualAttentionBlock",
- "__torch__.torch.nn.modules.linear.___torch_mangle_9556.Linear",
- "__torch__.torch.nn.modules.linear.___torch_mangle_9536.Linear",
- "__torch__.torch.nn.modules.linear.___torch_mangle_9463.Linear",
- "collections.OrderedDict",
- "__torch__.multimodal.model.multimodal_transformer.___torch_mangle_9397.LayerNorm",
- "__torch__.torch.nn.modules.linear.___torch_mangle_9493.Linear",
- "__torch__.torch.nn.modules.container.___torch_mangle_9383.Sequential",
- "__torch__.torch.nn.modules.activation.___torch_mangle_9579.MultiheadAttention",
- "__torch__.torch.nn.modules.linear.___torch_mangle_9440._LinearWithBias",
- "__torch__.torch.nn.modules.container.___torch_mangle_9548.Sequential",
- "__torch__.multimodal.model.multimodal_transformer.___torch_mangle_9550.ResidualAttentionBlock",
- "torch.HalfStorage",
- "__torch__.torch.nn.modules.linear.___torch_mangle_9445.Linear",
- "__torch__.multimodal.model.multimodal_transformer.___torch_mangle_9406.LayerNorm",
- "__torch__.multimodal.model.multimodal_transformer.___torch_mangle_9457.ResidualAttentionBlock",
- "__torch__.multimodal.model.multimodal_transformer.___torch_mangle_9535.LayerNorm",
- "__torch__.multimodal.model.multimodal_transformer.___torch_mangle_9531.LayerNorm",
- "__torch__.multimodal.model.multimodal_transformer.___torch_mangle_9558.LayerNorm",
- "__torch__.torch.nn.modules.linear.___torch_mangle_9389.Linear",
- "__torch__.multimodal.model.multimodal_transformer.___torch_mangle_9590.LayerNorm",
- "__torch__.multimodal.model.multimodal_transformer.___torch_mangle_9462.QuickGELU",
- "__torch__.torch.nn.modules.linear.___torch_mangle_9515._LinearWithBias",
- "__torch__.torch.nn.modules.linear.___torch_mangle_9407.Linear",
- "__torch__.multimodal.model.multimodal_transformer.VisualTransformer",
- "torch._utils._rebuild_tensor_v2",
- "torch.FloatStorage",
- "__torch__.multimodal.model.multimodal_transformer.___torch_mangle_9451.LayerNorm",
- "__torch__.multimodal.model.multimodal_transformer.___torch_mangle_9478.LayerNorm",
- "__torch__.torch.nn.modules.linear.___torch_mangle_9497._LinearWithBias",
- "__torch__.multimodal.model.multimodal_transformer.___torch_mangle_9469.LayerNorm",
- "__torch__.multimodal.model.multimodal_transformer.___torch_mangle_9501.QuickGELU",
- "__torch__.torch.nn.modules.linear.___torch_mangle_9554.Linear",
- "__torch__.torch.nn.modules.linear.___torch_mangle_9454.Linear",
- "__torch__.torch.nn.modules.linear.___torch_mangle_9458._LinearWithBias",
- "__torch__.multimodal.model.multimodal_transformer.___torch_mangle_9544.LayerNorm",
- "__torch__.multimodal.model.multimodal_transformer.___torch_mangle_9580.LayerNorm",
- "__torch__.torch.nn.modules.container.___torch_mangle_9566.Sequential",
- "__torch__.multimodal.model.multimodal_transformer.___torch_mangle_9429.LayerNorm",
- "__torch__.torch.nn.modules.linear.___torch_mangle_9449._LinearWithBias",
- "__torch__.multimodal.model.multimodal_transformer.___torch_mangle_9474.LayerNorm",
- "__torch__.torch.nn.modules.activation.___torch_mangle_9432.MultiheadAttention",
- "__torch__.multimodal.model.multimodal_transformer.___torch_mangle_9585.LayerNorm",
- "__torch__.multimodal.model.multimodal_transformer.___torch_mangle_9517.LayerNorm",
- "__torch__.torch.nn.modules.linear.___torch_mangle_9491.Linear",
- "__torch__.torch.nn.modules.activation.___torch_mangle_9468.MultiheadAttention",
- "__torch__.multimodal.model.multimodal_transformer.___torch_mangle_9549.LayerNorm",
- "__torch__.multimodal.model.multimodal_transformer.___torch_mangle_9399.QuickGELU",
- "__torch__.torch.nn.modules.linear.___torch_mangle_9479._LinearWithBias",
- "__torch__.torch.nn.modules.linear.___torch_mangle_9527.Linear",
- "__torch__.multimodal.model.multimodal_transformer.___torch_mangle_9526.LayerNorm",
- "__torch__.multimodal.model.multimodal_transformer.___torch_mangle_9568.ResidualAttentionBlock",
- "__torch__.torch.nn.modules.activation.___torch_mangle_9405.MultiheadAttention",
- "__torch__.torch.nn.modules.activation.___torch_mangle_9489.MultiheadAttention",
- "__torch__.multimodal.model.multimodal_transformer.___torch_mangle_9376.ResidualAttentionBlock",
- "__torch__.multimodal.model.multimodal_transformer.___torch_mangle_9564.QuickGELU",
- "__torch__.torch.nn.modules.linear.___torch_mangle_9563.Linear",
- "__torch__.multimodal.model.multimodal_transformer.___torch_mangle_9411.LayerNorm",
- "__torch__.multimodal.model.multimodal_transformer.___torch_mangle_9591.Multimodal",
- "__torch__.torch.nn.modules.activation.___torch_mangle_9423.MultiheadAttention",
- "__torch__.torch.nn.modules.linear.___torch_mangle_9391.Linear",
- "__torch__.torch.nn.modules.container.___torch_mangle_9575.Sequential",
- "__torch__.multimodal.model.multimodal_transformer.___torch_mangle_9504.LayerNorm",
- "__torch__.multimodal.model.multimodal_transformer.___torch_mangle_9513.LayerNorm",
- "__torch__.torch.nn.modules.linear.___torch_mangle_9500.Linear",
- "__torch__.multimodal.model.multimodal_transformer.___torch_mangle_9483.QuickGELU",
- "__torch__.multimodal.model.multimodal_transformer.___torch_mangle_9438.LayerNorm",
- "__torch__.multimodal.model.multimodal_transformer.___torch_mangle_9375.LayerNorm",
- "__torch__.multimodal.model.multimodal_transformer.___torch_mangle_9514.ResidualAttentionBlock",
- "__torch__.torch.nn.modules.linear.___torch_mangle_9395._LinearWithBias",
- "__torch__.torch.nn.modules.linear.___torch_mangle_9418.Linear",
- "__torch__.multimodal.model.multimodal_transformer.___torch_mangle_9370.LayerNorm",
- "__torch__.multimodal.model.multimodal_transformer.___torch_mangle_9402.LayerNorm",
- "__torch__.torch.nn.modules.container.___torch_mangle_9437.Sequential",
- "__torch__.torch.nn.modules.container.___torch_mangle_9455.Sequential",
- "__torch__.torch.nn.modules.activation.___torch_mangle_9396.MultiheadAttention",
- "__torch__.multimodal.model.multimodal_transformer.___torch_mangle_9519.QuickGELU",
- "__torch__.torch.nn.modules.linear.___torch_mangle_9427.Linear",
- "__torch__.torch.nn.modules.activation.___torch_mangle_9441.MultiheadAttention",
- "__torch__.torch.nn.modules.linear.___torch_mangle_9461.Linear",
- "__torch__.torch.nn.modules.container.___torch_mangle_9503.Sequential",
- "__torch__.torch.nn.modules.linear.___torch_mangle_9538.Linear",
- "__torch__.torch.nn.modules.linear.___torch_mangle_9413._LinearWithBias",
- "__torch__.torch.nn.modules.linear.___torch_mangle_9380.Linear",
- "__torch__.torch.nn.modules.linear.___torch_mangle_9509.Linear",
- "__torch__.torch.nn.modules.linear.___torch_mangle_9368._LinearWithBias",
- "__torch__.multimodal.model.multimodal_transformer.___torch_mangle_9412.ResidualAttentionBlock",
- "__torch__.multimodal.model.multimodal_transformer.___torch_mangle_9492.QuickGELU",
- "__torch__.multimodal.model.multimodal_transformer.___torch_mangle_9553.LayerNorm",
- "__torch__.torch.nn.modules.linear.___torch_mangle_9572.Linear",
- "__torch__.multimodal.model.multimodal_transformer.___torch_mangle_9433.LayerNorm",
- "__torch__.torch.nn.modules.linear.___torch_mangle_9373.Linear",
- "__torch__.multimodal.model.multimodal_transformer.___torch_mangle_9456.LayerNorm",
- "__torch__.torch.nn.modules.activation.___torch_mangle_9459.MultiheadAttention",
- "__torch__.multimodal.model.multimodal_transformer.___torch_mangle_9532.ResidualAttentionBlock",
- "__torch__.torch.nn.modules.container.___torch_mangle_9530.Sequential",
- "__torch__.multimodal.model.multimodal_transformer.___torch_mangle_9496.ResidualAttentionBlock",
- "__torch__.torch.nn.modules.linear.___torch_mangle_9518.Linear",
- "__torch__.torch.nn.modules.linear.___torch_mangle_9409.Linear",
- "__torch__.multimodal.model.multimodal_transformer.___torch_mangle_9562.LayerNorm",
- "__torch__.multimodal.model.multimodal_transformer.___torch_mangle_9424.LayerNorm",
- "__torch__.torch.nn.modules.linear.___torch_mangle_9422._LinearWithBias",
- "__torch__.multimodal.model.multimodal_transformer.___torch_mangle_9510.QuickGELU",
- "__torch__.torch.nn.modules.container.___torch_mangle_9494.Sequential",
- "__torch__.torch.nn.modules.container.___torch_mangle_9473.Sequential",
- "__torch__.torch.nn.modules.linear.___torch_mangle_9371.Linear",
- "__torch__.torch.nn.modules.container.___torch_mangle_9587.Sequential",
- "__torch__.multimodal.model.multimodal_transformer.___torch_mangle_9471.QuickGELU",
- "__torch__.torch.nn.modules.container.___torch_mangle_9539.Sequential",
- "__torch__.multimodal.model.multimodal_transformer.___torch_mangle_9381.QuickGELU",
- "__torch__.torch.nn.modules.container.___torch_mangle_9476.Sequential",
- "__torch__.torch.nn.modules.linear.___torch_mangle_9488._LinearWithBias",
- "__torch__.torch.nn.modules.linear.___torch_mangle_9467._LinearWithBias",
- "__torch__.torch.nn.modules.linear.___torch_mangle_9529.Linear",
- "__torch__.multimodal.model.multimodal_transformer.___torch_mangle_9372.QuickGELU",
- "__torch__.torch.nn.modules.activation.___torch_mangle_9525.MultiheadAttention",
- "__torch__.multimodal.model.multimodal_transformer.___torch_mangle_9528.QuickGELU",
- "__torch__.torch.nn.modules.linear.___torch_mangle_9511.Linear",
- "__torch__.torch.nn.modules.activation.___torch_mangle_9552.MultiheadAttention",
- "__torch__.torch.nn.modules.linear.___torch_mangle_9578._LinearWithBias",
- "torch.LongStorage",
- "__torch__.torch.nn.modules.linear.___torch_mangle_9416.Linear",
- "__torch__.multimodal.model.multimodal_transformer.___torch_mangle_9430.ResidualAttentionBlock",
- "__torch__.multimodal.model.multimodal_transformer.___torch_mangle_9487.ResidualAttentionBlock",
- "__torch__.torch.nn.modules.container.___torch_mangle_9557.Sequential",
- "__torch__.multimodal.model.multimodal_transformer.___torch_mangle_9486.LayerNorm",
- "__torch__.multimodal.model.multimodal_transformer.___torch_mangle_9435.QuickGELU",
- "__torch__.torch.nn.modules.container.___torch_mangle_9410.Sequential",
- "__torch__.torch.nn.modules.linear.___torch_mangle_9565.Linear",
- "__torch__.multimodal.model.multimodal_transformer.___torch_mangle_9508.LayerNorm",
- "__torch__.multimodal.model.multimodal_transformer.___torch_mangle_9582.QuickGELU",
- "__torch__.torch.nn.modules.linear.___torch_mangle_9482.Linear",
- "__torch__.torch.nn.modules.linear.___torch_mangle_9434.Linear",
- "__torch__.torch.nn.modules.container.___torch_mangle_9485.Sequential",
- "__torch__.torch.nn.modules.linear.___torch_mangle_9425.Linear",
- "__torch__.multimodal.model.multimodal_transformer.___torch_mangle_9388.LayerNorm",
- "__torch__.multimodal.model.multimodal_transformer.___torch_mangle_9420.LayerNorm",
- "__torch__.multimodal.model.multimodal_transformer.___torch_mangle_9385.ResidualAttentionBlock",
- "__torch__.torch.nn.modules.activation.___torch_mangle_9414.MultiheadAttention",
- "__torch__.multimodal.model.multimodal_transformer.___torch_mangle_9426.QuickGELU",
- "__torch__.multimodal.model.multimodal_transformer.___torch_mangle_9393.LayerNorm",
- "__torch__.multimodal.model.multimodal_transformer.___torch_mangle_9439.ResidualAttentionBlock",
- "__torch__.torch.nn.modules.linear.___torch_mangle_9404._LinearWithBias",
- "__torch__.multimodal.model.multimodal_transformer.___torch_mangle_9460.LayerNorm",
- "__torch__.torch.nn.modules.linear.___torch_mangle_9502.Linear",
- "__torch__.torch.nn.modules.activation.___torch_mangle_9543.MultiheadAttention",
- "__torch__.torch.nn.modules.activation.___torch_mangle_9561.MultiheadAttention",
- "__torch__.torch.nn.modules.linear.___torch_mangle_9520.Linear",
- "__torch__.torch.nn.modules.linear.___torch_mangle_9583.Linear",
- "__torch__.multimodal.model.multimodal_transformer.___torch_mangle_9490.LayerNorm",
- "__torch__.multimodal.model.multimodal_transformer.___torch_mangle_9577.ResidualAttentionBlock",
- "__torch__.torch.nn.modules.linear.___torch_mangle_9524._LinearWithBias",
- "__torch__.torch.nn.modules.activation.___torch_mangle_9498.MultiheadAttention",
- "torch.LongStorage",
- "torch._utils._rebuild_tensor_v2",
- "torch.DoubleStorage",
- "torch.HalfStorage",
- "collections.OrderedDict"
Git LFS Details
- SHA256: 40d365715913c9da98579312b702a82c18be219cc2a73407c4526f58eba950af
- Pointer size: 134 Bytes
- Size of remote file: 354 MB
Git Large File Storage (LFS) replaces large files with text pointers inside Git, while storing the file contents on a remote server.
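Because the repository itself stores only the small pointer record (the 134 bytes above: a `version` line, an `oid sha256:...` line, and a `size` line), the SHA256 in the Git LFS Details is what identifies and verifies the 354 MB object. A minimal check of a downloaded copy, assuming a placeholder local path:

```python
# Verify a downloaded copy of the checkpoint against the Git LFS oid.
# The pointer committed to Git is a short text record of the form
#   version https://git-lfs.github.com/spec/v1
#   oid sha256:<hex digest>
#   size <bytes>
# "model.pt" is a placeholder path; the digest is the SHA256 listed
# under Git LFS Details.
import hashlib

EXPECTED_OID = "40d365715913c9da98579312b702a82c18be219cc2a73407c4526f58eba950af"

def sha256_of(path: str, chunk_size: int = 1 << 20) -> str:
    """Hash the file in 1 MiB chunks so the 354 MB object is never
    held in memory at once."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

digest = sha256_of("model.pt")
print("OK" if digest == EXPECTED_OID else f"MISMATCH: {digest}")
```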