The function supports various loss functions, including:

- 'hinge', 'svm' or 'SVM'
- 'check', 'quantile', 'quantile regression' or 'QR'
- 'sSVM', 'smooth SVM' or 'smooth hinge'
- 'TV'
- 'huber' or 'Huber'
- 'SVR' or 'svr'
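As background, the losses named above have standard textbook definitions. The sketch below gives plain NumPy versions of a few of them; the exact parameterizations used internally (e.g. the smoothing constant of the smooth hinge) are assumptions here, not the library's implementation.

```python
import numpy as np

def hinge(z):
    """Hinge (SVM) loss: max(0, 1 - z), where z = y * f(x)."""
    return np.maximum(0.0, 1.0 - z)

def smooth_hinge(z, tau=1.0):
    """Smoothed hinge (sSVM): quadratic near the kink, linear beyond it."""
    m = 1.0 - z  # margin shortfall
    return np.where(m <= 0, 0.0,
                    np.where(m <= tau, m**2 / (2 * tau), m - tau / 2))

def huber(r, delta=1.0):
    """Huber loss on a residual r: quadratic for |r| <= delta, linear beyond."""
    a = np.abs(r)
    return np.where(a <= delta, 0.5 * r**2, delta * (a - 0.5 * delta))

def check(r, q=0.5):
    """Check (pinball) loss for quantile regression at quantile level q."""
    return np.where(r >= 0, q * r, (q - 1.0) * r)
```

All four are piecewise linear-quadratic, which is what makes them representable in the ReHLine composite form.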
The following constraint types are supported:

* 'nonnegative' or '>=0': A non-negativity constraint.
* 'fair' or 'fairness': A fairness constraint.
* 'custom': A custom constraint, where the user must provide the constraint matrix 'A' and vector 'b'.
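To make the 'custom' option concrete, here is a hypothetical sketch of expressing non-negativity of the coefficients as a custom linear constraint. The sign convention `A @ beta + b >= 0` is an assumption made for illustration; check the library's documentation for the convention it actually uses.

```python
import numpy as np

n_features = 3
A = np.eye(n_features)    # A @ beta = beta
b = np.zeros(n_features)  # beta + 0 >= 0  <=>  beta >= 0

# A constraint list of dicts, as described above; the key names beyond
# 'name' are illustrative assumptions.
constraint = [{'name': 'custom', 'A': A, 'b': b}]

def is_feasible(beta, A, b):
    """Check A @ beta + b >= 0 elementwise (assumed convention)."""
    return bool(np.all(A @ beta + b >= 0))
```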
Parameters
----------
loss : dict
    A dictionary specifying the loss function parameters.

constraint : list of dict
    A list of dictionaries, where each dictionary represents a constraint.
    Each dictionary must contain a 'name' key specifying the type of constraint.

C : float, default=1.0
    Regularization parameter. The strength of the regularization is
    inversely proportional to C. Must be strictly positive.
    `C` is absorbed into the ReHLine parameters when `self.make_ReLHLoss` is called.

l1_ratio : float, default=0.5
    The ElasticNet mixing parameter, with 0 <= l1_ratio < 1. For l1_ratio = 0
    the penalty is an L2 penalty. For 0 < l1_ratio < 1, the penalty is a
    combination of L1 and L2.
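The interplay of `C` and `l1_ratio` can be sketched with one common parameterization of the elastic net penalty. The exact scaling used by the estimator is an assumption here; this only illustrates the roles of the two parameters (inverse regularization strength, and L1/L2 mixing).

```python
import numpy as np

def elastic_net_penalty(w, C=1.0, l1_ratio=0.5):
    """Illustrative elastic net penalty: regularization strength 1/C,
    split between an L1 part and an L2 part by l1_ratio."""
    l1 = np.sum(np.abs(w))       # L1 part, weighted by l1_ratio
    l2 = 0.5 * np.sum(w**2)      # L2 part, weighted by (1 - l1_ratio)
    return (l1_ratio * l1 + (1.0 - l1_ratio) * l2) / C
```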
verbose : int, default=0
    Enable verbose output. Note that this setting takes advantage of a
    per-process runtime setting in liblinear that, if enabled, may not work
    properly in a multithreaded context.

max_iter : int, default=1000
    The maximum number of iterations to be run.

_U, _V : array of shape (L, n_samples), default=np.empty(shape=(0, 0))
    The parameters pertaining to the ReLU part in the loss function.

_Tau, _S, _T : array of shape (H, n_samples), default=np.empty(shape=(0, 0))
    The parameters pertaining to the ReHU part in the loss function.

_A : array of shape (K, n_features), default=np.empty(shape=(0, 0))
    The coefficient matrix in the linear constraint.

_b : array of shape (K,), default=np.empty(shape=0)
    The intercept vector in the linear constraint.
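For readers unfamiliar with the ReLU and ReHU building blocks referenced by `_U`, `_V`, `_Tau`, `_S`, and `_T`, the sketch below gives their standard definitions (as in the ReHLine paper) in plain NumPy; this is background, not the solver's internal code.

```python
import numpy as np

def relu(z):
    """ReLU: max(z, 0)."""
    return np.maximum(z, 0.0)

def rehu(z, tau):
    """ReHU: 0 for z <= 0, z**2 / 2 on (0, tau], and
    tau * (z - tau / 2) for z > tau (quadratic then linear)."""
    return np.where(z <= 0, 0.0,
                    np.where(z <= tau, 0.5 * z**2, tau * (z - 0.5 * tau)))
```

A ReHLine-representable loss is a sum of such ReLU and ReHU pieces applied to affine functions of the model output, which is what the `_U`/`_V` and `_Tau`/`_S`/`_T` arrays encode.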
Attributes
----------
coef_ : array-like
    The optimized model coefficients.

n_iter_ : int
    The number of iterations performed by the ReHLine solver.

opt_result_ : object
    The optimization result object.

dual_obj_ : array-like
    The dual objective function values.

primal_obj_ : array-like
    The primal objective function values.
Methods
-------
fit(X, y, sample_weight=None)
    Fit the model based on the given training data.

decision_function(X)
    The decision function evaluated on the given dataset.
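For a linear model, the decision function is typically the linear score `X @ coef_`; the minimal sketch below illustrates this with plain NumPy. Whether the estimator also adds an intercept term is an assumption left out here.

```python
import numpy as np

def decision_function(X, coef):
    """Linear decision scores: one value per row of X."""
    X = np.asarray(X, dtype=float)
    return X @ coef
```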
Notes
-----
The `plqERM_ElasticNet` class is a subclass of `_BaseReHLine` and `BaseEstimator`, and is part of a larger framework for implementing ReHLine algorithms.
Overview
========

.. list-table:: Methods
   :header-rows: 0
   :widths: auto
   :class: summarytable

   * - :py:obj:`fit <rehline.plqERM_ElasticNet.fit>`\ (X, y, sample_weight)