Bandit Algorithms for Website Optimization
John Myles White
Sold by Rarewaves USA United, OSWEGO, IL, U.S.A.
AbeBooks Seller since 20 June 2025
New - Soft cover
Condition: New
Quantity: Over 20 available
This book shows you how to run experiments on your website using A/B testing, and then takes you a huge step further by introducing you to bandit algorithms for website optimization. Author John Myles White shows you how this family of algorithms can help you boost website traffic, convert visitors to customers, and increase many other measures of success. This is the first developer-focused book on bandit algorithms, which have previously been described only in research papers. You'll learn about several simple algorithms you can deploy on your own websites to improve your business, including the epsilon-greedy algorithm, the UCB algorithm, and a contextual bandit algorithm. All of these algorithms are implemented in easy-to-follow Python code and can be quickly adapted to your business's specific needs.
You'll also learn about a framework for testing and debugging bandit algorithms using Monte Carlo simulations, a technique originally developed by nuclear physicists during World War II. Monte Carlo techniques let you decide whether A/B testing will work for your business needs or whether you need to deploy a more sophisticated bandit algorithm.
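To give a flavour of the kind of algorithm the description refers to, here is a minimal epsilon-greedy sketch in Python. It is an illustration only, not the book's own code listing; the class name and structure are assumptions.

import random

class EpsilonGreedy:
    """Epsilon-greedy bandit: explore a random arm with probability
    epsilon, otherwise exploit the arm with the best average reward."""

    def __init__(self, epsilon, n_arms):
        self.epsilon = epsilon
        self.counts = [0] * n_arms       # times each arm has been tried
        self.values = [0.0] * n_arms     # running mean reward per arm

    def select_arm(self):
        if random.random() < self.epsilon:
            return random.randrange(len(self.counts))   # explore
        return self.values.index(max(self.values))      # exploit

    def update(self, arm, reward):
        self.counts[arm] += 1
        n = self.counts[arm]
        # incremental update of the running mean reward for this arm
        self.values[arm] += (reward - self.values[arm]) / n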
Seller Inventory # LU-9781449341336
When looking for ways to improve your website, how do you decide which changes to make? And which changes to keep? This concise book shows you how to use multiarmed bandit algorithms to measure the real-world value of any modifications you make to your site. Author John Myles White shows you how this powerful class of algorithms can help you boost website traffic, convert visitors to customers, and increase many other measures of success.
This is the first developer-focused book on bandit algorithms, which were previously described only in research papers. You’ll quickly learn the benefits of several simple algorithms—including the epsilon-Greedy, Softmax, and Upper Confidence Bound (UCB) algorithms—by working through code examples written in Python, which you can easily adapt for deployment on your own website.
Learn the basics of A/B testing—and recognize when it's better to use bandit algorithms
Develop a unit testing framework for debugging bandit algorithms
Get additional code examples written in Julia, Ruby, and JavaScript with supplemental online materials
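As a rough sketch of the Monte Carlo testing idea mentioned above (again an illustration, not the book's framework), one might simulate an algorithm against Bernoulli-reward arms. The helper names bernoulli_arm and simulate are hypothetical, and the code reuses the EpsilonGreedy class from the earlier sketch.

import random

def bernoulli_arm(p):
    """Simulated arm that pays reward 1 with probability p, else 0."""
    return 1.0 if random.random() < p else 0.0

def simulate(algo, arm_probs, horizon):
    """Run one Monte Carlo trial against simulated arms; return total reward."""
    total = 0.0
    for _ in range(horizon):
        arm = algo.select_arm()
        reward = bernoulli_arm(arm_probs[arm])
        algo.update(arm, reward)
        total += reward
    return total

# Example: two page variants with 5% and 10% simulated conversion rates.
algo = EpsilonGreedy(epsilon=0.1, n_arms=2)
print(simulate(algo, arm_probs=[0.05, 0.10], horizon=1000))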
Please note that we do not offer Priority shipping to any country.
We currently do not ship to the following countries:
Afghanistan
Bhutan
Brazil
Brunei Darussalam
Channel Islands
Chile
Israel
Laos
Mexico
Russian Federation
Saudi Arabia
South Africa
Yemen
Please do not place orders with any of these countries as the ship-to address; such orders will be cancelled.