Home
A TensorFlow recommendation algorithm and framework in Python.
TensorRec is a Python recommendation system that allows you to quickly develop recommendation algorithms and customize them using TensorFlow.
TensorRec lets you customize your recommendation system's embedding functions and loss functions, while TensorRec handles the data manipulation, scoring, and ranking needed to generate recommendations.
A TensorRec system consumes three pieces of data: user_features, item_features, and interactions. It uses this data to learn to make and rank recommendations.
For more information, and for an outline of this project, please read this blog post.
TensorRec can be installed via pip:
pip install tensorrec
import numpy as np
import tensorrec
# Build the model with default parameters
model = tensorrec.TensorRec()
# Generate some dummy data
interactions, user_features, item_features = tensorrec.util.generate_dummy_data(
    num_users=100, num_items=150, interaction_density=.05)
# Fit the model for 5 epochs
model.fit(interactions, user_features, item_features, epochs=5, verbose=True)
# Predict scores for all users and all items
predictions = model.predict(user_features=user_features,
                            item_features=item_features)
# Calculate and print the recall at 10
r_at_k = tensorrec.eval.recall_at_k(model, interactions,
                                    k=10,
                                    user_features=user_features,
                                    item_features=item_features)
print(np.mean(r_at_k))
The following examples show what user/item features and interactions would look like in a TensorRec system meant to recommend business consulting projects (items) to consultants (users).
The data is represented in matrices. TensorRec can consume these matrices as any scipy.sparse matrix.
[Example user feature, item feature, and interaction matrices; images from Medium]
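As a small, hand-built illustration of these inputs (the specific features and values below are hypothetical, not taken from the images), the three matrices could be constructed as scipy.sparse matrices:

import numpy as np
from scipy import sparse

# user_features: [n_users, n_user_features] -- e.g. one-hot expertise per consultant
user_features = sparse.csr_matrix(np.array([[1, 0, 0],
                                            [0, 1, 0],
                                            [0, 0, 1]], dtype=np.float32))

# item_features: [n_items, n_item_features] -- e.g. one-hot category per project
item_features = sparse.csr_matrix(np.array([[1, 0],
                                            [0, 1],
                                            [1, 0],
                                            [0, 1]], dtype=np.float32))

# interactions: [n_users, n_items] -- nonzero entries are observed interactions,
# which may be positive or negative depending on the loss function used
interactions = sparse.csr_matrix(np.array([[1, 0, 0, -1],
                                           [0, 1, 0, 0],
                                           [0, 0, 1, 0]], dtype=np.float32))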
TensorRec allows you to define the algorithm that will be used to compute loss for a set of recommendation predictions.
You can define a custom loss function yourself, or you can use a pre-made loss function that comes with TensorRec in tensorrec.loss_graphs.
RMSELossGraph
This loss function returns the root mean square error between the predictions and the true interactions.
Interactions can be any positive or negative values, and this loss function is sensitive to magnitude.
RMSEDenseLossGraph
This loss function returns the root mean square error between the predictions and the true interactions, including all non-interacted values as 0s.
Interactions can be any positive or negative values, and this loss function is sensitive to magnitude.
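To make the RMSE losses concrete, here is a minimal sketch of the calculation an RMSE-style loss graph performs over the serialized predictions and interactions (the same tensors the custom loss example below receives). This is an illustration of the idea, not the library's exact implementation:

import tensorflow as tf

def rmse_loss(tf_prediction_serial, tf_interactions_serial):
    # Root mean square error between predicted scores and observed interaction values.
    # For the dense variant, the serialized tensors would also include the
    # non-interacted user/item pairs with an interaction value of 0.
    return tf.sqrt(tf.reduce_mean(tf.square(tf_interactions_serial - tf_prediction_serial)))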
SeparationLossGraph
This loss function models the explicit positive and negative interaction predictions as normal distributions and returns the probability of overlap between the two distributions.
Interactions can be any positive or negative values, but this loss function ignores the magnitude of the interaction -- interactions are grouped into {i <= 0} and {i > 0}.
SeparationDenseLossGraph
This loss function models all positive and negative interaction predictions as normal distributions and returns the probability of overlap between the two distributions. This loss function includes non-interacted items as negative interactions.
Interactions can be any positive or negative values, but this loss function ignores the magnitude of the interaction -- interactions are grouped into {i <= 0} and {i > 0}.
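As a rough sketch of the idea behind the separation losses (again an illustration of the concept, not the library's exact implementation), the predictions for positive and non-positive interactions can each be summarized by a mean and variance, and the loss taken as the probability that a prediction drawn from the negative group exceeds one drawn from the positive group:

import tensorflow as tf

def separation_loss(tf_prediction_serial, tf_interactions_serial):
    # Split predictions into those for positive ({i > 0}) and non-positive ({i <= 0}) interactions
    positive_predictions = tf.boolean_mask(tf_prediction_serial, tf_interactions_serial > 0)
    negative_predictions = tf.boolean_mask(tf_prediction_serial, tf_interactions_serial <= 0)

    # Model each group of predictions as a normal distribution
    pos_mean, pos_var = tf.nn.moments(positive_predictions, axes=[0])
    neg_mean, neg_var = tf.nn.moments(negative_predictions, axes=[0])

    # For independent normals, (positive - negative) is normal with mean
    # (pos_mean - neg_mean) and variance (pos_var + neg_var)
    diff_mean = pos_mean - neg_mean
    diff_std = tf.sqrt(pos_var + neg_var + 1e-9)

    # P(negative >= positive): standard normal CDF at -diff_mean / diff_std,
    # computed via the error function
    overlap_probability = 0.5 * (1.0 + tf.erf(-diff_mean / (diff_std * tf.sqrt(2.0))))
    return overlap_probability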
WMRBLossGraph
An approximation of WMRB: Learning to Rank in a Scalable Batch Training Approach.
Interactions can be any positive values, but magnitude is ignored. Negative interactions are ignored.
BalancedWMRBLossGraph
This loss graph extends WMRB by making it sensitive to interaction magnitude and by weighting the loss of each item by 1 / sum(interactions) per item.
Interactions can be any positive values. Negative interactions are ignored.
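Assuming the class names listed above, a pre-made loss graph is used by passing an instance to the TensorRec constructor, just as the custom loss graph is passed in the example below:

import tensorrec

# Build a model that ranks with the WMRB approximation instead of the default loss
model = tensorrec.TensorRec(loss_graph=tensorrec.loss_graphs.WMRBLossGraph())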
import tensorflow as tf
import tensorrec
# Define a custom loss graph
class SimpleLossGraph(tensorrec.loss_graphs.AbstractLossGraph):
    def connect_loss_graph(self, tf_prediction_serial, tf_interactions_serial, **kwargs):
        """
        This loss function returns the absolute error between the predictions and the interactions.
        :param tf_prediction_serial: tf.Tensor
        The recommendation scores as a Tensor of shape [n_samples, 1]
        :param tf_interactions_serial: tf.Tensor
        The sample interactions corresponding to tf_prediction_serial as a Tensor of shape [n_samples, 1]
        :param kwargs:
        Other TensorFlow nodes.
        :return:
        A tf.Tensor containing the learning loss.
        """
        return tf.reduce_mean(tf.abs(tf_interactions_serial - tf_prediction_serial))
# Build a model with the custom loss function
model = tensorrec.TensorRec(loss_graph=SimpleLossGraph())
# Generate some dummy data
interactions, user_features, item_features = tensorrec.util.generate_dummy_data(
    num_users=100, num_items=150, interaction_density=.05)
# Fit the model for 5 epochs
model.fit(interactions, user_features, item_features, epochs=5, verbose=True)
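After fitting, a model built with a custom loss graph is used exactly like the quick start model above, for example:

# Predict scores for all users and all items
predictions = model.predict(user_features=user_features,
                            item_features=item_features)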