Commit 886c63bb authored by davidkep

Update README with example

parent ca9904ef
Due to the early stage of the library, the interface might change considerably in the future.
The C++ header files are in [inst/include](inst/include) and can be used from within other R packages by adding `nsoptim`
to the `LinkingTo` field in the package's DESCRIPTION file.
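For example, a dependent package's DESCRIPTION file might contain the following (the package name `mypackage` is a placeholder; since `nsoptim` builds on armadillo, `Rcpp` and `RcppArmadillo` are typically listed as well):

```
Package: mypackage
LinkingTo: nsoptim, Rcpp, RcppArmadillo
```

With this in place, the package's C++ sources can simply `#include <nsoptim.hpp>`.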
## Example Usage
```cpp
#include <memory>
#include <nsoptim.hpp>  // This also includes the armadillo library.

// Alias for a linearized ADMM optimizer operating on the standard LS regression loss
// and an EN penalty, using a dense coefficient vector.
using LinearizedAdmmOptimizer = nsoptim::LinearizedAdmmOptimizer<nsoptim::LsLoss, nsoptim::EnPenalty, nsoptim::RegressionCoefficients<arma::vec>>;

typename LinearizedAdmmOptimizer::Coefficients Foo() {
  // Generate dummy data with 100 observations and 10 predictors.
  auto data = std::make_shared<nsoptim::PredictorResponseData>(arma::randn(100, 10), arma::randn(100));
  nsoptim::LsLoss loss(data);  // Create an LS loss function object with the generated data.
  nsoptim::EnPenalty penalty1(0.5, 2.4);  // Create an EN penalty function object with alpha=0.5 and lambda=2.4.
  nsoptim::EnPenalty penalty2(0.5, 1.5);  // Create an EN penalty function object with alpha=0.5 and lambda=1.5.

  // Create an optimizer for the given loss and penalty function,
  // using default parameters for the ADMM algorithm.
  LinearizedAdmmOptimizer optim(loss, penalty1);

  // Compute the optimum for `penalty1`, starting at the 0-vector.
  typename LinearizedAdmmOptimizer::Optimum optimum = optim.Optimize();

  // Change the penalty to `penalty2`.
  optim.penalty(penalty2);
  // Compute the optimum for `penalty2`, starting at the optimum for `penalty1`.
  optimum = optim.Optimize();

  // Only return the coefficients.
  return optimum.coefs;
}
```
## Documentation
The documentation for the library is a work in progress. Currently, source code files include doxygen-style comments.