DRecPy.Evaluation.Metrics package
DRecPy.Evaluation.Metrics.ranking module
- class DRecPy.Evaluation.Metrics.ranking.AveragePrecision
  Bases: DRecPy.Evaluation.Metrics.ranking.RankingMetricABC
  Average Precision at k.
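The computation can be sketched in plain Python (an illustrative formula, not DRecPy's implementation; normalizing by min(|relevant|, k) is one common convention):

```python
def average_precision_at_k(recommended, relevant, k):
    """Average of precision@i over every rank i (within the top-k)
    at which a relevant item appears."""
    hits, score = 0, 0.0
    for i, item in enumerate(recommended[:k], start=1):
        if item in relevant:
            hits += 1
            score += hits / i  # precision at this cut-off
    # Normalize by the best achievable number of hits.
    return score / min(len(relevant), k) if relevant else 0.0
```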
- class DRecPy.Evaluation.Metrics.ranking.DCG(strong_relevancy=True)
  Bases: DRecPy.Evaluation.Metrics.ranking.RankingMetricABC
  Discounted Cumulative Gain at k.
  Parameters: strong_relevancy – An optional boolean indicating which variant of the DCG is used. If set to True, usually results in smaller values than when it’s set to False. Default: True.
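A minimal sketch of the two classic DCG gain variants (illustrative only; which variant strong_relevancy selects here is an assumption about the flag, not DRecPy's exact code):

```python
from math import log2

def dcg_at_k(relevancies, k, strong_relevancy=True):
    """DCG@k over per-rank relevance scores. strong_relevancy is
    ASSUMED to select the exponential gain 2^rel - 1; the other
    variant uses the raw relevance value."""
    total = 0.0
    for i, rel in enumerate(relevancies[:k], start=1):
        gain = (2 ** rel - 1) if strong_relevancy else rel
        total += gain / log2(i + 1)  # logarithmic position discount
    return total
```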
- class DRecPy.Evaluation.Metrics.ranking.FScore(beta=1)
  Bases: DRecPy.Evaluation.Metrics.ranking.RankingMetricABC
  F-score at k.
  Parameters: beta – An optional integer representing the weight of the recall value on the combined score. Beta > 1 favours recall over precision, while beta < 1 favours precision over recall. Default: 1.
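The standard F-beta formula described above can be sketched as (illustrative, not the DRecPy implementation):

```python
def f_score(precision, recall, beta=1):
    """F-beta score: weighted harmonic mean of precision and recall.
    beta > 1 weighs recall more heavily; beta < 1 favours precision."""
    if precision == 0 and recall == 0:
        return 0.0  # avoid division by zero when both are zero
    b2 = beta ** 2
    return (1 + b2) * precision * recall / (b2 * precision + recall)
```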
- class DRecPy.Evaluation.Metrics.ranking.HitRatio
  Bases: DRecPy.Evaluation.Metrics.ranking.RankingMetricABC
  Hit Ratio at k.
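Hit-ratio definitions vary across libraries; a common binary (leave-one-out) form, shown here purely as a sketch rather than DRecPy's exact definition, is:

```python
def hit_ratio_at_k(recommended, relevant, k):
    """Binary hit ratio: 1.0 if at least one relevant item appears
    in the top-k recommendations, else 0.0 (one common convention)."""
    return 1.0 if any(item in relevant for item in recommended[:k]) else 0.0
```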
- class DRecPy.Evaluation.Metrics.ranking.NDCG(strong_relevancy=True)
  Bases: DRecPy.Evaluation.Metrics.ranking.RankingMetricABC
  Normalized Discounted Cumulative Gain at k.
  Parameters: strong_relevancy – An optional boolean indicating which variant of the underlying DCG is used. If set to True, usually results in smaller values than when it’s set to False. Default: True.
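NDCG normalizes the DCG by the DCG of the ideal ordering, giving a value in [0, 1]. A self-contained sketch (the gain variant chosen for strong_relevancy is an assumption, not DRecPy's exact code):

```python
from math import log2

def ndcg_at_k(relevancies, k, strong_relevancy=True):
    """NDCG@k: DCG of the given ordering divided by the DCG of the
    ideal (descending-relevance) ordering."""
    def dcg(rels):
        # strong_relevancy is ASSUMED to pick the exponential gain.
        return sum(((2 ** r - 1) if strong_relevancy else r) / log2(i + 1)
                   for i, r in enumerate(rels[:k], start=1))
    ideal = dcg(sorted(relevancies, reverse=True))
    return dcg(relevancies) / ideal if ideal > 0 else 0.0
```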
- class DRecPy.Evaluation.Metrics.ranking.Precision
  Bases: DRecPy.Evaluation.Metrics.ranking.RankingMetricABC
  Precision at k.
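A minimal sketch of precision@k (illustrative; dividing by k is one common convention, while some variants divide by the number of items actually returned):

```python
def precision_at_k(recommended, relevant, k):
    """Precision@k: share of the top-k recommendations that are
    relevant."""
    return sum(1 for item in recommended[:k] if item in relevant) / k
```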
- class DRecPy.Evaluation.Metrics.ranking.RankingMetricABC
  Bases: DRecPy.Evaluation.Metrics.metric_abc.MetricABC
  Abstract base class for ranking metrics.
- class DRecPy.Evaluation.Metrics.ranking.Recall
  Bases: DRecPy.Evaluation.Metrics.ranking.RankingMetricABC
  Recall at k.
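A minimal sketch of recall@k (illustrative only, not the DRecPy implementation):

```python
def recall_at_k(recommended, relevant, k):
    """Recall@k: share of all relevant items that made it into the
    top-k recommendations."""
    if not relevant:
        return 0.0  # no relevant items: define recall as 0
    hits = sum(1 for item in recommended[:k] if item in relevant)
    return hits / len(relevant)
```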
- class DRecPy.Evaluation.Metrics.ranking.ReciprocalRank
  Bases: DRecPy.Evaluation.Metrics.ranking.RankingMetricABC
  Reciprocal Rank at k.
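The reciprocal rank of a list can be sketched as (illustrative, not DRecPy's code):

```python
def reciprocal_rank_at_k(recommended, relevant, k):
    """Reciprocal of the rank of the first relevant item within the
    top-k recommendations; 0.0 if none appears."""
    for i, item in enumerate(recommended[:k], start=1):
        if item in relevant:
            return 1.0 / i
    return 0.0
```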
DRecPy.Evaluation.Metrics.regression module
- class DRecPy.Evaluation.Metrics.regression.MAE
  Bases: DRecPy.Evaluation.Metrics.regression.PredictiveMetricABC
  Mean Absolute Error.
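The MAE formula can be sketched as (illustrative only):

```python
def mae(y_true, y_pred):
    """Mean Absolute Error: average absolute difference between
    true and predicted ratings."""
    return sum(abs(t - p) for t, p in zip(y_true, y_pred)) / len(y_true)
```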
- class DRecPy.Evaluation.Metrics.regression.MSE
  Bases: DRecPy.Evaluation.Metrics.regression.PredictiveMetricABC
  Mean Squared Error.
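The MSE formula can be sketched as (illustrative only):

```python
def mse(y_true, y_pred):
    """Mean Squared Error: like MAE but squares each residual,
    penalizing large errors more heavily."""
    return sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)
```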
- class DRecPy.Evaluation.Metrics.regression.PredictiveMetricABC
  Bases: DRecPy.Evaluation.Metrics.metric_abc.MetricABC
  Abstract base class for predictive (regression) metrics.
- class DRecPy.Evaluation.Metrics.regression.RMSE
  Bases: DRecPy.Evaluation.Metrics.regression.PredictiveMetricABC
  Root Mean Squared Error.
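The RMSE formula can be sketched as (illustrative only):

```python
from math import sqrt

def rmse(y_true, y_pred):
    """Root Mean Squared Error: square root of the MSE, expressed
    in the same units as the ratings themselves."""
    return sqrt(sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true))
```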