deepctr_torch.models.afm module

Author:
Weichen Shen, weichenswc@163.com
Reference:
[1] Xiao J, Ye H, He X, et al. Attentional factorization machines: Learning the weight of feature interactions via attention networks[J]. arXiv preprint arXiv:1708.04617, 2017. (https://arxiv.org/abs/1708.04617)
class deepctr_torch.models.afm.AFM(linear_feature_columns, dnn_feature_columns, use_attention=True, attention_factor=8, l2_reg_linear=1e-05, l2_reg_embedding=1e-05, l2_reg_att=1e-05, afm_dropout=0, init_std=0.0001, seed=1024, task='binary', device='cpu', gpus=None)[source]

Instantiates the Attentional Factorization Machine architecture.
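
For orientation, the AFM prediction from the cited paper [1] can be written as follows (notation follows the paper; this is a sketch of the architecture, not of this module's exact implementation):

    \hat{y}_{\mathrm{AFM}}(\mathbf{x}) = w_0 + \sum_{i=1}^{n} w_i x_i + \mathbf{p}^{\top} \sum_{i=1}^{n} \sum_{j=i+1}^{n} a_{ij} \, (\mathbf{v}_i \odot \mathbf{v}_j) \, x_i x_j

where the attention weight a_{ij} is the softmax over feature pairs of \mathbf{h}^{\top} \mathrm{ReLU}(\mathbf{W} (\mathbf{v}_i \odot \mathbf{v}_j) x_i x_j + \mathbf{b}), with \mathbf{W} \in \mathbb{R}^{t \times k}; t corresponds to the attention_factor parameter below. With use_attention=False the attention weights are omitted and the model reduces to a standard Factorization Machine.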

Parameters:
  • linear_feature_columns – An iterable containing all the features used by the linear part of the model.
  • dnn_feature_columns – An iterable containing all the features used by the deep part of the model.
  • use_attention – bool, whether to use attention. If set to False, the model is equivalent to a standard Factorization Machine.
  • attention_factor – positive integer, number of units in the attention net.
  • l2_reg_linear – float, L2 regularizer strength applied to the linear part.
  • l2_reg_embedding – float, L2 regularizer strength applied to the embedding vectors.
  • l2_reg_att – float, L2 regularizer strength applied to the attention net.
  • afm_dropout – float in [0,1), fraction of the attention net output units to drop out.
  • init_std – float, standard deviation used to initialize the embedding vectors.
  • seed – integer, to use as the random seed.
  • task – str, "binary" for binary logloss or "regression" for regression loss.
  • device – str, "cpu" or "cuda:0".
  • gpus – list of int or torch.device, for running on multiple GPUs. If None, run on device. gpus[0] should be the same GPU as device.
Returns:

A PyTorch model instance.
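A minimal usage sketch follows. The feature names, vocabulary sizes, and toy data are illustrative assumptions; compile and fit come from the Keras-like training API that deepctr_torch models inherit from their base model class.

    import numpy as np
    import torch

    from deepctr_torch.inputs import SparseFeat
    from deepctr_torch.models import AFM

    # Two illustrative categorical fields (names and sizes are assumptions).
    feature_columns = [
        SparseFeat('user_id', vocabulary_size=100, embedding_dim=4),
        SparseFeat('item_id', vocabulary_size=200, embedding_dim=4),
    ]

    # Random toy data: one integer id per field, a binary label per row.
    x = {'user_id': np.random.randint(0, 100, size=256),
         'item_id': np.random.randint(0, 200, size=256)}
    y = np.random.randint(0, 2, size=256).astype(np.float32)

    device = 'cuda:0' if torch.cuda.is_available() else 'cpu'
    model = AFM(linear_feature_columns=feature_columns,
                dnn_feature_columns=feature_columns,
                use_attention=True, attention_factor=8,
                task='binary', device=device)

    # Keras-style training loop provided by the base model.
    model.compile('adam', 'binary_crossentropy',
                  metrics=['binary_crossentropy'])
    model.fit(x, y, batch_size=64, epochs=1, verbose=0)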

forward(X)[source]

Defines the computation performed at every call.

Should be overridden by all subclasses.

Note

Although the recipe for the forward pass needs to be defined within this function, one should call the Module instance afterwards instead of this, since the former takes care of running the registered hooks while the latter silently ignores them.
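
For example (a minimal sketch; here X is assumed to be the already-assembled input tensor of shape (batch_size, total_input_dim) that the library's training utilities build internally from the feature dict):

    y_pred = model(X)  # preferred: runs any registered forward hooks
    # y_pred = model.forward(X)  # works, but silently skips hooks

In everyday use, model.predict(x_dict) performs this input assembly for you and returns predictions as a NumPy array.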