mars.learn.contrib.xgboost.XGBRegressor
- class mars.learn.contrib.xgboost.XGBRegressor(max_depth: Optional[int] = None, max_leaves: Optional[int] = None, max_bin: Optional[int] = None, grow_policy: Optional[str] = None, learning_rate: Optional[float] = None, n_estimators: int = 100, verbosity: Optional[int] = None, objective: Optional[Union[str, Callable[[ndarray, ndarray], Tuple[ndarray, ndarray]]]] = None, booster: Optional[str] = None, tree_method: Optional[str] = None, n_jobs: Optional[int] = None, gamma: Optional[float] = None, min_child_weight: Optional[float] = None, max_delta_step: Optional[float] = None, subsample: Optional[float] = None, sampling_method: Optional[str] = None, colsample_bytree: Optional[float] = None, colsample_bylevel: Optional[float] = None, colsample_bynode: Optional[float] = None, reg_alpha: Optional[float] = None, reg_lambda: Optional[float] = None, scale_pos_weight: Optional[float] = None, base_score: Optional[float] = None, random_state: Optional[Union[int, RandomState]] = None, missing: float = nan, num_parallel_tree: Optional[int] = None, monotone_constraints: Optional[Union[Dict[str, int], str]] = None, interaction_constraints: Optional[Union[str, Sequence[Sequence[str]]]] = None, importance_type: Optional[str] = None, gpu_id: Optional[int] = None, validate_parameters: Optional[bool] = None, predictor: Optional[str] = None, enable_categorical: bool = False, max_cat_to_onehot: Optional[int] = None, eval_metric: Optional[Union[str, List[str], Callable]] = None, early_stopping_rounds: Optional[int] = None, callbacks: Optional[List[TrainingCallback]] = None, **kwargs: Any)
Implementation of the scikit-learn API for XGBoost regressor.
- __init__(max_depth: Optional[int] = None, max_leaves: Optional[int] = None, max_bin: Optional[int] = None, grow_policy: Optional[str] = None, learning_rate: Optional[float] = None, n_estimators: int = 100, verbosity: Optional[int] = None, objective: Optional[Union[str, Callable[[ndarray, ndarray], Tuple[ndarray, ndarray]]]] = None, booster: Optional[str] = None, tree_method: Optional[str] = None, n_jobs: Optional[int] = None, gamma: Optional[float] = None, min_child_weight: Optional[float] = None, max_delta_step: Optional[float] = None, subsample: Optional[float] = None, sampling_method: Optional[str] = None, colsample_bytree: Optional[float] = None, colsample_bylevel: Optional[float] = None, colsample_bynode: Optional[float] = None, reg_alpha: Optional[float] = None, reg_lambda: Optional[float] = None, scale_pos_weight: Optional[float] = None, base_score: Optional[float] = None, random_state: Optional[Union[int, RandomState]] = None, missing: float = nan, num_parallel_tree: Optional[int] = None, monotone_constraints: Optional[Union[Dict[str, int], str]] = None, interaction_constraints: Optional[Union[str, Sequence[Sequence[str]]]] = None, importance_type: Optional[str] = None, gpu_id: Optional[int] = None, validate_parameters: Optional[bool] = None, predictor: Optional[str] = None, enable_categorical: bool = False, max_cat_to_onehot: Optional[int] = None, eval_metric: Optional[Union[str, List[str], Callable]] = None, early_stopping_rounds: Optional[int] = None, callbacks: Optional[List[TrainingCallback]] = None, **kwargs: Any) -> None
Methods

- __init__([max_depth, max_leaves, max_bin, ...])
- apply(X[, ntree_limit, iteration_range]): Return the predicted leaf index of every tree for each sample.
- evals_result(): Return the evaluation results.
- fit(X, y[, sample_weight, base_margin, ...]): Fit the regressor. X (array_like) is the feature matrix and y (array_like) the labels; sample_weight (array_like) gives per-instance weights. eval_set (list, optional) is a list of (X, y) pairs used as validation sets, on which metrics are computed to track model performance; sample_weight_eval_set (list, optional) is a list of the form [L_1, L_2, ..., L_n], where each L_i is a list of group weights on the i-th validation set.
- get_booster(): Get the underlying xgboost Booster of this model.
- get_num_boosting_rounds(): Get the number of xgboost boosting rounds.
- get_params([deep]): Get parameters.
- get_xgb_params(): Get xgboost-specific parameters.
- load_model(fname): Load the model from a file or bytearray.
- predict(data, **kw): Predict with data.
- save_model(fname): Save the model to a file.
- set_params(**params): Set the parameters of this estimator.
Attributes

- best_iteration: The best iteration obtained by early stopping.
- best_ntree_limit
- best_score: The best score obtained by early stopping.
- coef_: Coefficients property.
- feature_importances_: Feature importances property; the return value depends on the importance_type parameter.
- feature_names_in_: Names of features seen during fit().
- intercept_: Intercept (bias) property.
- n_features_in_: Number of features seen during fit().