A Modified Bayesian Optimization based Hyper-Parameter Tuning Approach for Extreme Gradient Boosting
Sayan Putatunda, Kiran Rama
arXiv:2004.05041, 10 April 2020
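The paper's subject is a modified Bayesian optimization scheme for tuning XGBoost hyper-parameters. For orientation only, the sketch below shows standard (unmodified) Bayesian optimization over a few common XGBoost hyper-parameters using scikit-optimize's BayesSearchCV; the dataset, search space, and iteration budget are illustrative assumptions, not the authors' setup, and the sketch assumes xgboost and scikit-optimize are installed.

# Minimal sketch: standard Bayesian optimization of XGBoost hyper-parameters.
# This is NOT the paper's modified method; search space and budget are
# illustrative choices.
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from skopt import BayesSearchCV
from skopt.space import Integer, Real
from xgboost import XGBClassifier

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

search = BayesSearchCV(
    estimator=XGBClassifier(eval_metric="logloss"),
    search_spaces={
        "max_depth": Integer(2, 10),
        "learning_rate": Real(1e-3, 0.3, prior="log-uniform"),
        "n_estimators": Integer(50, 500),
        "subsample": Real(0.5, 1.0),
    },
    n_iter=25,       # Bayesian optimization iterations (surrogate-guided trials)
    cv=3,            # 3-fold cross-validation score per candidate configuration
    random_state=0,
)
search.fit(X_train, y_train)
print("Best hyper-parameters:", search.best_params_)
print("Held-out accuracy:", search.score(X_test, y_test))

Each iteration fits a probabilistic surrogate (a Gaussian process by default) to the cross-validated scores observed so far and selects the next configuration via an acquisition function; this is the generic loop that modified-BO approaches such as the paper's build on.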

Papers citing "A Modified Bayesian Optimization based Hyper-Parameter Tuning Approach for Extreme Gradient Boosting" (6 of 6 papers shown):

1. Sequential Large Language Model-Based Hyper-parameter Optimization
   Kanan Mahammadli, Seyda Ertekin (27 Oct 2024)
2. Tunability: Importance of Hyperparameters of Machine Learning Algorithms
   Philipp Probst, B. Bischl, A. Boulesteix (26 Feb 2018)
3. Fast Bayesian Optimization of Machine Learning Hyperparameters on Large Datasets
   Aaron Klein, Stefan Falkner, Simon Bartels, Philipp Hennig, Frank Hutter (23 May 2016)
4. XGBoost: A Scalable Tree Boosting System
   Tianqi Chen, Carlos Guestrin (09 Mar 2016)
5. An ensemble-based system for automatic screening of diabetic retinopathy
   B. Antal, András Hajdu (30 Oct 2014)
6. Practical Bayesian Optimization of Machine Learning Algorithms
   Jasper Snoek, Hugo Larochelle, Ryan P. Adams (13 Jun 2012)