@@ -30,6 +30,22 @@ See more details in the [ReHLine documentation](https://rehline-python.readthedo
 pip install rehline
 ```
 
+### Development Install
+
+For contributors and developers:
+
+```bash
+git clone https://github.com/softmin/ReHLine-python.git
+cd ReHLine-python
+pip install -e ".[dev]"
+```
+
+To run tests:
+
+```bash
+pytest tests/
+```
+
 ## 🚀 Quick Start
 
 ### Scikit-Learn Style API (Recommended)
@@ -79,20 +95,27 @@ print(f"Best params: {grid_search.best_params_}")
 from rehline import ReHLine
 import numpy as np
 
+# Generate sample data
+np.random.seed(42)
+X = np.random.randn(100, 5)
+y = np.random.choice([-1, 1], size=100)
+n, d = X.shape
+C = 1.0
+
 # Define custom PLQ loss parameters
 clf = ReHLine()
 # Set custom U, V matrices for ReLU loss
 # and S, T, tau for ReHU loss
 ## U
-clf.U = -(C*y).reshape(1, -1)
+clf._U = -(C*y).reshape(1, -1)
 ## V
-clf.V = (C*np.array(np.ones(n))).reshape(1, -1)
+clf._V = (C*np.ones(n)).reshape(1, -1)
 
 # Set custom linear constraints A*beta + b >= 0
 X_sen = X[:, 0]
 tol_sen = 0.1
-clf.A = np.repeat([X_sen @ X], repeats=[2], axis=0) / n
-clf.A[1] = -clf.A[1]
+clf._A = np.repeat([X_sen @ X], repeats=[2], axis=0) / n
+clf._A[1] = -clf._A[1]
 
 clf.fit(X)
 ```
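The `_U` and `_V` settings in the hunk above encode the hinge loss in ReHLine's ReLU form, where each loss term is `max(U_i * z_i + V_i, 0)` with `z_i = x_i . beta`. With `U = -C*y` and `V = C*1` this reduces to the standard hinge loss `C * max(1 - y*z, 0)`. A minimal NumPy sketch checking that equivalence (the data shapes and `beta` here are illustrative assumptions, not part of the patch):

```python
import numpy as np

# Assumed illustrative data: n samples, d features, margin scores z = X @ beta
rng = np.random.default_rng(0)
n, d, C = 8, 3, 1.0
X = rng.standard_normal((n, d))
y = rng.choice([-1.0, 1.0], size=n)
beta = rng.standard_normal(d)

# ReLU-form parameters as set in the diff: U = -C*y, V = C*1
U = -(C * y)
V = C * np.ones(n)

z = X @ beta
relu_form = np.maximum(U * z + V, 0.0)    # max(-C*y*z + C, 0)
hinge = C * np.maximum(1.0 - y * z, 0.0)  # standard hinge loss

assert np.allclose(relu_form, hinge)
```

The two constraint rows built from `X_sen` play a similar role: duplicating `(X_sen @ X) / n` and negating the second row bounds the sensitive-feature correlation from both sides under the `A*beta + b >= 0` convention.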
@@ -113,7 +136,7 @@ ReHLine excels at solving a wide range of machine learning problems:
 | **Sparse Learning** | Feature selection with L1 regularization | Scales to high dimensions |
 | **Custom Optimization** | Any PLQ loss with linear constraints | Flexible framework for research |
 
-<!--
+<!--
 ## 📝 Formulation
 
 **ReHLine** is designed to address the empirical regularized ReLU-ReHU minimization problem, named *ReHLine optimization*, of the following form: