---
output: github_document
---
<!-- README.md is generated from README.Rmd. Please edit that file -->
```{r, include = FALSE}
knitr::opts_chunk$set(
collapse = TRUE,
comment = "#>",
fig.path = "man/figures/README-",
out.width = "100%"
)
```
# fastadi
<!-- badges: start -->
[![R-CMD-check](https://github.com/RoheLab/fastadi/actions/workflows/R-CMD-check.yaml/badge.svg)](https://github.com/RoheLab/fastadi/actions/workflows/R-CMD-check.yaml)
[![Codecov test coverage](https://codecov.io/gh/RoheLab/fastadi/branch/main/graph/badge.svg)](https://app.codecov.io/gh/RoheLab/fastadi?branch=main)
<!-- badges: end -->
`fastadi` implements the `AdaptiveImpute` matrix completion algorithm. `fastadi` is a self-tuning alternative to algorithms such as `SoftImpute` (implemented in the [`softImpute`](https://cran.r-project.org/package=softImpute) package), truncated SVD, maximum margin matrix factorization, and weighted regularized matrix factorization (implemented in the [`rsparse`](https://github.com/rexyai/rsparse) package). In simulations `fastadi` often outperforms `softImpute` by a small margin.
You may find `fastadi` useful if you are developing embeddings for sparsely observed data, working in natural language processing, or building a recommendation system.
## Installation
You can install the released version from [CRAN](https://cran.r-project.org/) with:
``` r
install.packages("fastadi")
```
You can install the development version from [GitHub](https://github.com/) with:
``` r
# install.packages("devtools")
devtools::install_github("RoheLab/fastadi")
```
## Example usage
Here we embed users and items in the MovieLens 100K dataset.
```{r}
library(fastadi)
mf <- adaptive_impute(ml100k, rank = 3L, max_iter = 5L)
```
```{r}
mf
```
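Once you have a fit, you will typically want the estimated latent factors. The sketch below assumes the fitted object is an SVD-like S4 object with slots `u`, `d`, and `v` (inspect your fit with `str(mf)` to confirm the slot names in your installed version); the scaling of embeddings by singular values is one common convention, not the only one.

```r
# Continuing from the fit `mf` above. Assumes `mf` carries SVD-style
# slots `u` (left singular vectors), `d` (singular values), and
# `v` (right singular vectors) -- check str(mf) on your version.

# One common choice of embeddings: scale singular vectors by
# singular values, giving one row per user and one row per item.
user_embeddings <- mf@u %*% diag(mf@d)
item_embeddings <- mf@v %*% diag(mf@d)

# The imputed value for user i and item j is the (i, j) entry of
# the low-rank reconstruction U D V'.
predict_entry <- function(fit, i, j) {
  drop(fit@u[i, , drop = FALSE] %*% diag(fit@d) %*% t(fit@v[j, , drop = FALSE]))
}

predict_entry(mf, 1, 1)
```

Because the factorization is low rank, you can score individual user-item pairs this way without ever materializing the full dense reconstruction.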
Note that the vignettes are currently scratch work for reference by the developers and are not yet ready for general consumption.
## References
1. Hayes, Alex, and Karl Rohe. “Finding Topics in Citation Data.” 2022+.
2. Cho, Juhee, Donggyu Kim, and Karl Rohe. “Asymptotic Theory for Estimating the Singular Vectors and Values of a Partially-Observed Low Rank Matrix with Noise.” Statistica Sinica, 2018. https://doi.org/10.5705/ss.202016.0205.
3. ———. “Intelligent Initialization and Adaptive Thresholding for Iterative Matrix Completion: Some Statistical and Algorithmic Theory for Adaptive-Impute.” Journal of Computational and Graphical Statistics 28, no. 2 (April 3, 2019): 323–33. https://doi.org/10.1080/10618600.2018.1518238.
4. Mazumder, Rahul, Trevor Hastie, and Robert Tibshirani. “Spectral Regularization Algorithms for Learning Large Incomplete Matrices.” Journal of Machine Learning Research, 2010. https://web.stanford.edu/~hastie/Papers/mazumder10a.pdf.
You can find the original implementation accompanying these papers [here](https://github.com/chojuhee/hello-world).