
Bug? #18

@bowen-xu

Description


In the class PrimaryCaps, the forward function is defined as follows:

def forward(self, x):
    # each capsule is a conv layer whose output has shape [100, 32, 6, 6]
    u = [capsule(x) for capsule in self.capsules]
    u = torch.stack(u, dim=1)              # [100, 8, 32, 6, 6]
    u = u.view(x.size(0), 32 * 6 * 6, -1)  # [100, 1152, 8]
    return self.squash(u)

After u = torch.stack(u, dim=1), the shape of u becomes [100, 8, 32, 6, 6], so u.view(x.size(0), 32 * 6 * 6, -1) flattens the wrong dimensions: each resulting 8-dimensional vector is assembled from spatial entries of a single capsule map, not from the eight capsule outputs at one spatial position. I think the correct code should be u = torch.stack(u, dim=-1), which gives shape [100, 32, 6, 6, 8] and then flattens to [100, 1152, 8] with each 8-vector intact.

Is this a bug, or do I misunderstand it?
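A quick way to check the claim above is to stack dummy per-capsule outputs and inspect the flattened vectors. This is a minimal sketch using NumPy, whose stack/reshape follow the same row-major semantics as torch.stack/view; the batch size 100 and the per-capsule output shape [100, 32, 6, 6] are taken from the description above, and filling capsule i with the constant value i makes the two layouts easy to tell apart:

```python
import numpy as np

batch, n_caps, C, H, W = 100, 8, 32, 6, 6

# one output per capsule conv; capsule i is filled with the value i
u = [np.full((batch, C, H, W), i, dtype=np.float32) for i in range(n_caps)]

# dim=1 puts the capsule axis before the spatial axes
u1 = np.stack(u, axis=1)                  # [100, 8, 32, 6, 6]
flat1 = u1.reshape(batch, C * H * W, -1)  # [100, 1152, 8], but each row
                                          # holds 8 spatial entries of ONE capsule

# dim=-1 keeps the capsule axis last, so each 8-vector stays contiguous
u2 = np.stack(u, axis=-1)                 # [100, 32, 6, 6, 8]
flat2 = u2.reshape(batch, C * H * W, -1)  # [100, 1152, 8], each row is the
                                          # 8 capsule outputs at one position

print(flat1[0, 0])  # [0. 0. 0. 0. 0. 0. 0. 0.] -- all from capsule 0
print(flat2[0, 0])  # [0. 1. 2. 3. 4. 5. 6. 7.] -- one value per capsule
```

With dim=1 every "capsule vector" after the view is constant (it was sliced out of one capsule's feature map), whereas with dim=-1 each vector collects the eight capsule activations at a single spatial location, which is what the squash is meant to act on.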
