Using pretrained models from mmf

Hi, I have loaded the Visual BERT model pretrained on COCO from MMF, but I am getting 0.422 AUROC when I run forward() on the validation data. I am tokenizing the text with a BERT tokenizer and setting the input_mask to all 1's and the segment_ids to all 0's (a rough sketch of this is included at the end of this post). The output from forward() is

{'scores': tensor([[ 3.3944, -3.3940],
                   [ 4.1372, -4.5100],
                   [ 4.0722, -4.3152],
                   [ 3.7807, -4.3434],
                   [ 4.1213, -4.4178],

And after applying softmax, the probability in the second column is almost 0 for every example.
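
To make that concrete, applying softmax to the first row of scores above gives a second-column probability of about 0.001 (a quick check, not part of my original code):

import torch

scores = torch.tensor([3.3944, -3.3940])
probs = torch.softmax(scores, dim=-1)
print(probs)  # tensor([0.9989, 0.0011]) -- the second entry is ~0.001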

Has anyone been able to get the pretrained model working from code rather than from the command line? Any help is greatly appreciated.
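
For reference, my input preparation looks roughly like this (a minimal sketch; the tokenizer checkpoint, example text, and max length below are placeholders rather than the exact values I used):

import torch
from transformers import BertTokenizer

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")  # assumed checkpoint

encoded = tokenizer(
    "example meme text",       # placeholder caption
    padding="max_length",
    truncation=True,
    max_length=128,            # assumed sequence length
    return_tensors="pt",
)
input_ids = encoded["input_ids"]
input_mask = torch.ones_like(input_ids)    # all 1's, as described above
segment_ids = torch.zeros_like(input_ids)  # all 0's, as described above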

Hi,
MMF creator here.

This has been fixed in the master branch of MMF. You can either install the updated pip package or pull the latest source.

Thanks for the reply. I tried uninstalling mmf and reinstalling using

git clone https://github.com/facebookresearch/mmf.git
cd mmf
pip install --editable .

But I still get the same predictions from the pretrained model as in my original post. Any additional help would be greatly appreciated. Thank you.
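
As a sanity check, something like this can confirm which copy of MMF Python is actually importing (the __version__ attribute is an assumption about the package layout, so it is guarded):

import mmf

print(mmf.__file__)                            # should point into the freshly cloned mmf/ directory
print(getattr(mmf, "__version__", "unknown"))  # version string, if the package exposes one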

Okay, let's move this to the MMF repo. Can you open a full issue there with the exact commands you ran?
