<?xml version="1.0" encoding="utf-8" ?> <rss version="2.0" xmlns:opensearch="http://a9.com/-/spec/opensearch/1.1/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:atom="http://www.w3.org/2005/Atom"> <channel> <title> <![CDATA[St. Xavier's University Library Search for 'su:&quot;MACHINE LEARNING&quot;']]> </title> <link> /cgi-bin/koha/opac-search.pl?q=ccl=su%3A%22MACHINE%20LEARNING%22&#38;sort_by=relevance&#38;format=rss </link> <atom:link rel="self" type="application/rss+xml" href="/cgi-bin/koha/opac-search.pl?q=ccl=su%3A%22MACHINE%20LEARNING%22&#38;sort_by=relevance&#38;format=rss"/> <description> <![CDATA[ Search results for 'su:&quot;MACHINE LEARNING&quot;' at St. Xavier's University Library]]> </description> <opensearch:totalResults>27</opensearch:totalResults> <opensearch:startIndex>0</opensearch:startIndex> <opensearch:itemsPerPage>50</opensearch:itemsPerPage> <atom:link rel="search" type="application/opensearchdescription+xml" href="/cgi-bin/koha/opac-search.pl?q=ccl=su%3A%22MACHINE%20LEARNING%22&#38;sort_by=relevance&#38;format=opensearchdescription"/> <opensearch:Query role="request" searchTerms="q%3Dccl%3Dsu%253A%2522MACHINE%2520LEARNING%2522" startPage="" /> <item> <title> Machine Learning </title> <dc:identifier>ISBN:9789353066697</dc:identifier> <link>/cgi-bin/koha/opac-detail.pl?biblionumber=7260</link> <description> <![CDATA[ <img src="https://images-na.ssl-images-amazon.com/images/P/9353066697.01.TZZZZZZZ.jpg" alt="" /> ]]> <![CDATA[ <p> By Dutt, Saikat .<br /> Noida Pearson 2019 .<br /> xxvii, 428 9789353066697 </p> ]]> <![CDATA[ <p> <a href="/cgi-bin/koha/opac-reserve.pl?biblionumber=7260">Place hold on <em>Machine Learning </em></a> </p> ]]> </description> <guid>/cgi-bin/koha/opac-detail.pl?biblionumber=7260</guid> </item> <item> <title> Practical time series analysis : prediction with statistics and machine learning </title> <dc:identifier>ISBN:9789352139255</dc:identifier> <link>/cgi-bin/koha/opac-detail.pl?biblionumber=7421</link> 
<description> <![CDATA[ <img src="https://images-na.ssl-images-amazon.com/images/P/9352139259.01.TZZZZZZZ.jpg" alt="" /> ]]> <![CDATA[ <p> By Nielsen, Aileen .<br /> Kolkata Shroff publishers &amp; distributors 2020 .<br /> xvi, 480 , Table of contents : Preface -- 1. Time Series: An Overview and a Quick History -- 2. Finding and Wrangling Time Series Data -- 3. Exploratory Data Analysis for Time Series -- 4. Simulating Time Series Data -- 5. Storing Temporal Data -- 6. Statistical Models for Time Series -- 7. State Space Models for Time Series -- 8. Generating and Selecting Features for a Time Series -- 9. Machine Learning for Time Series -- 10. Deep Learning for Time Series -- 11. Measuring Error -- 12. Performance Considerations in Fitting and Serving Time Series Models -- 13. Healthcare Applications -- 14. Financial Applications -- 15. Time Series for Government -- 16. Time Series Packages -- 17. Forecasts About Forecasting -- Index. 9789352139255 </p> ]]> <![CDATA[ <p> <a href="/cgi-bin/koha/opac-reserve.pl?biblionumber=7421">Place hold on <em>Practical time series analysis </em></a> </p> ]]> </description> <guid>/cgi-bin/koha/opac-detail.pl?biblionumber=7421</guid> </item> <item> <title> Deep learning / </title> <dc:identifier>ISBN:9780262035613 (hardcover : alk. paper) | 0262035618 (hardcover : alk.
paper)</dc:identifier> <link>/cgi-bin/koha/opac-detail.pl?biblionumber=8120</link> <description> <![CDATA[ <img src="https://images-na.ssl-images-amazon.com/images/P/0262035618.01.TZZZZZZZ.jpg" alt="" /> ]]> <![CDATA[ <p> By Goodfellow, Ian.<br /> Cambridge, Massachusetts MIT Press 2016 .<br /> xxii, 775 pages : , An introduction to a broad range of topics in deep learning, covering mathematical and conceptual background, deep learning techniques used in industry, and research perspectives. “Written by three experts in the field, Deep Learning is the only comprehensive book on the subject.” —Elon Musk, cochair of OpenAI; cofounder and CEO of Tesla and SpaceX Deep learning is a form of machine learning that enables computers to learn from experience and understand the world in terms of a hierarchy of concepts. Because the computer gathers knowledge from experience, there is no need for a human computer operator to formally specify all the knowledge that the computer needs. The hierarchy of concepts allows the computer to learn complicated concepts by building them out of simpler ones; a graph of these hierarchies would be many layers deep. This book introduces a broad range of topics in deep learning. The text offers mathematical and conceptual background, covering relevant concepts in linear algebra, probability theory and information theory, numerical computation, and machine learning. It describes deep learning techniques used by practitioners in industry, including deep feedforward networks, regularization, optimization algorithms, convolutional networks, sequence modeling, and practical methodology; and it surveys such applications as natural language processing, speech recognition, computer vision, online recommendation systems, bioinformatics, and videogames.
Finally, the book offers research perspectives, covering such theoretical topics as linear factor models, autoencoders, representation learning, structured probabilistic models, Monte Carlo methods, the partition function, approximate inference, and deep generative models. Deep Learning can be used by undergraduate or graduate students planning careers in either industry or research, and by software engineers who want to begin using deep learning in their products or platforms. A website offers supplementary material for both readers and instructors. 24 cm..<br /> 9780262035613 (hardcover : alk. paper) | 0262035618 (hardcover : alk. paper) </p> ]]> <![CDATA[ <p> <a href="/cgi-bin/koha/opac-reserve.pl?biblionumber=8120">Place hold on <em>Deep learning /</em></a> </p> ]]> </description> <guid>/cgi-bin/koha/opac-detail.pl?biblionumber=8120</guid> </item> <item> <title> Deep learning with Python </title> <dc:identifier>ISBN:9781617296864</dc:identifier> <link>/cgi-bin/koha/opac-detail.pl?biblionumber=9718</link> <description> <![CDATA[ <img src="https://images-na.ssl-images-amazon.com/images/P/1617296864.01.TZZZZZZZ.jpg" alt="" /> ]]> <![CDATA[ <p> By Chollet, Francois .<br /> Shelter Island Manning 2021 .<br /> xxiv, 478 9781617296864 </p> ]]> <![CDATA[ <p> <a href="/cgi-bin/koha/opac-reserve.pl?biblionumber=9718">Place hold on <em>Deep learning with Python </em></a> </p> ]]> </description> <guid>/cgi-bin/koha/opac-detail.pl?biblionumber=9718</guid> </item> <item> <title> Deep Learning : A practitioner's approach </title> <dc:identifier>ISBN:9789352136049</dc:identifier> <link>/cgi-bin/koha/opac-detail.pl?biblionumber=9844</link> <description> <![CDATA[ <img src="https://images-na.ssl-images-amazon.com/images/P/9352136047.01.TZZZZZZZ.jpg" alt="" /> ]]> <![CDATA[ <p> By Patterson, Josh .<br /> New Delhi Shroff publishers &amp; distributors 2021 .<br /> xxi, 507 , Includes index &amp; appendix 9789352136049 </p> ]]> <![CDATA[ <p> <a 
href="/cgi-bin/koha/opac-reserve.pl?biblionumber=9844">Place hold on <em>Deep Learning </em></a> </p> ]]> </description> <guid>/cgi-bin/koha/opac-detail.pl?biblionumber=9844</guid> </item> <item> <title> Pattern recognition and machine learning </title> <dc:identifier>ISBN:9781493938438</dc:identifier> <link>/cgi-bin/koha/opac-detail.pl?biblionumber=9985</link> <description> <![CDATA[ <img src="https://images-na.ssl-images-amazon.com/images/P/1493938436.01.TZZZZZZZ.jpg" alt="" /> ]]> <![CDATA[ <p> By Bishop, Christopher M..<br /> New York Springer 2009 .<br /> 738p. , includes index 9781493938438 </p> ]]> <![CDATA[ <p> <a href="/cgi-bin/koha/opac-reserve.pl?biblionumber=9985">Place hold on <em>Pattern recognition and machine learning</em></a> </p> ]]> </description> <guid>/cgi-bin/koha/opac-detail.pl?biblionumber=9985</guid> </item> <item> <title> Machine learning using R </title> <dc:identifier>ISBN:9789354246111</dc:identifier> <link>/cgi-bin/koha/opac-detail.pl?biblionumber=10423</link> <description> <![CDATA[ <img src="https://images-na.ssl-images-amazon.com/images/P/9354246117.01.TZZZZZZZ.jpg" alt="" /> ]]> <![CDATA[ <p> By Kumar Rahul .<br /> New Delhi Wiley 2022 .<br /> xxii, 430 , Includes index 9789354246111 </p> ]]> <![CDATA[ <p> <a href="/cgi-bin/koha/opac-reserve.pl?biblionumber=10423">Place hold on <em>Machine learning using R </em></a> </p> ]]> </description> <guid>/cgi-bin/koha/opac-detail.pl?biblionumber=10423</guid> </item> <item> <title> The elements of statistical learning : data mining, inference, and prediction / </title> <dc:identifier>ISBN:9780387848570 (hardcover : alk. paper) | 9780387848587 (electronic)</dc:identifier> <link>/cgi-bin/koha/opac-detail.pl?biblionumber=10574</link> <description> <![CDATA[ <img src="https://images-na.ssl-images-amazon.com/images/P/0387848576.01.TZZZZZZZ.jpg" alt="" /> ]]> <![CDATA[ <p> By Hastie, Trevor..<br /> New York, NY : Springer, 2009 .<br /> xxii, 745 p. 
: , Table of contents : Introduction (pp. 1-8) -- Overview of Supervised Learning (9-41) -- Linear Methods for Regression (43-99) -- Linear Methods for Classification (101-137) -- Basis Expansions and Regularization (139-189) -- Kernel Smoothing Methods (191-218) -- Model Assessment and Selection (219-259) -- Model Inference and Averaging (261-294) -- Additive Models, Trees, and Related Methods (295-336) -- Boosting and Additive Trees (337-387) -- Neural Networks (389-416) -- Support Vector Machines and Flexible Discriminants (417-458) -- Prototype Methods and Nearest-Neighbors (459-483) -- Unsupervised Learning (485-585) -- Random Forests (587-604) -- Ensemble Learning (605-624) -- Undirected Graphical Models (625-648) -- High-Dimensional Problems: p ≫ N (649-698) -- Back Matter. 25 cm..<br /> 9780387848570 (hardcover : alk.
paper) | 9780387848587 (electronic) </p> ]]> <![CDATA[ <p> <a href="/cgi-bin/koha/opac-reserve.pl?biblionumber=10574">Place hold on <em>The elements of statistical learning :</em></a> </p> ]]> </description> <guid>/cgi-bin/koha/opac-detail.pl?biblionumber=10574</guid> </item> <item> <title> Machine Learning </title> <dc:identifier>ISBN:9789353066697</dc:identifier> <link>/cgi-bin/koha/opac-detail.pl?biblionumber=10908</link> <description> <![CDATA[ <img src="https://images-na.ssl-images-amazon.com/images/P/9353066697.01.TZZZZZZZ.jpg" alt="" /> ]]> <![CDATA[ <p> By Dutt, Saikat .<br /> Noida Pearson 2019 .<br /> xxvii, 428 9789353066697 </p> ]]> <![CDATA[ <p> <a href="/cgi-bin/koha/opac-reserve.pl?biblionumber=10908">Place hold on <em>Machine Learning </em></a> </p> ]]> </description> <guid>/cgi-bin/koha/opac-detail.pl?biblionumber=10908</guid> </item> <item> <title> Statistical Sherlock: Sleuthing out fake news stories </title> <dc:identifier>ISBN:</dc:identifier> <link>/cgi-bin/koha/opac-detail.pl?biblionumber=11237</link> <description> <![CDATA[ <p> By Sen, Ritobrata..<br /> Kolkata St. Xavier's University 2024 .<br /> 76p. 
, Includes graph </p> ]]> <![CDATA[ <p> <a href="/cgi-bin/koha/opac-reserve.pl?biblionumber=11237">Place hold on <em>Statistical Sherlock:</em></a> </p> ]]> </description> <guid>/cgi-bin/koha/opac-detail.pl?biblionumber=11237</guid> </item> <item> <title> Practical machine learning for computer vision : end-to end machine learning for images </title> <dc:identifier>ISBN:9789391043834</dc:identifier> <link>/cgi-bin/koha/opac-detail.pl?biblionumber=11717</link> <description> <![CDATA[ <img src="https://images-na.ssl-images-amazon.com/images/P/9391043836.01.TZZZZZZZ.jpg" alt="" /> ]]> <![CDATA[ <p> By Lakshmanan, Valliappa .<br /> Kolkata Shroff publishers &amp; distributors 2021 .<br /> xvi, 463 , Includes index 9789391043834 </p> ]]> <![CDATA[ <p> <a href="/cgi-bin/koha/opac-reserve.pl?biblionumber=11717">Place hold on <em>Practical machine learning for computer vision </em></a> </p> ]]> </description> <guid>/cgi-bin/koha/opac-detail.pl?biblionumber=11717</guid> </item> <item> <title> Apply data science : introduction, applications and projects </title> <dc:identifier>ISBN:9783658387976</dc:identifier> <link>/cgi-bin/koha/opac-detail.pl?biblionumber=12126</link> <description> <![CDATA[ <img src="https://images-na.ssl-images-amazon.com/images/P/3658387971.01.TZZZZZZZ.jpg" alt="" /> ]]> <![CDATA[ <p> Wiesbaden Springer 2023 .<br /> 232p. , includes index 9783658387976 </p> ]]> <![CDATA[ <p> <a href="/cgi-bin/koha/opac-reserve.pl?biblionumber=12126">Place hold on <em>Apply data science</em></a> </p> ]]> </description> <guid>/cgi-bin/koha/opac-detail.pl?biblionumber=12126</guid> </item> <item> <title> Introduction to Machine Learning with R : Rigorous Mathematical analysis </title> <dc:identifier>ISBN:9789352137251</dc:identifier> <link>/cgi-bin/koha/opac-detail.pl?biblionumber=13735</link> <description> <![CDATA[ <img src="https://images-na.ssl-images-amazon.com/images/P/9352137256.01.TZZZZZZZ.jpg" alt="" /> ]]> <![CDATA[ <p> By Burger, Scott V. 
.<br /> Kolkata Shroff publishers 2018 .<br /> ix, 212 , Table of contents : What is a model? -- Supervised and unsupervised machine learning -- Sampling statistics and model training in R -- Regression in a nutshell -- Neural networks in a nutshell -- Tree-based methods -- Other advanced methods -- Machine learning with the caret package -- Encyclopedia of machine learning models in caret. Includes index 9789352137251 </p> ]]> <![CDATA[ <p> <a href="/cgi-bin/koha/opac-reserve.pl?biblionumber=13735">Place hold on <em>Introduction to Machine Learning with R </em></a> </p> ]]> </description> <guid>/cgi-bin/koha/opac-detail.pl?biblionumber=13735</guid> </item> <item> <title> Reinforcement Learning : industrial applications of intelligent agents </title> <dc:identifier>ISBN:9789385889509</dc:identifier> <link>/cgi-bin/koha/opac-detail.pl?biblionumber=13738</link> <description> <![CDATA[ <img src="https://images-na.ssl-images-amazon.com/images/P/9385889508.01.TZZZZZZZ.jpg" alt="" /> ]]> <![CDATA[ <p> By Winder, Phil.<br /> Kolkata Shroff 2021 .<br /> xxiii, 379 , Includes index, glossary &amp; appendix 9789385889509 </p> ]]> <![CDATA[ <p> <a href="/cgi-bin/koha/opac-reserve.pl?biblionumber=13738">Place hold on <em>Reinforcement Learning </em></a> </p> ]]> </description> <guid>/cgi-bin/koha/opac-detail.pl?biblionumber=13738</guid> </item> <item> <title> Probabilistic Machine learning for Finance and investing : A primer to generative AI with Python </title> <dc:identifier>ISBN:9789355429995</dc:identifier> <link>/cgi-bin/koha/opac-detail.pl?biblionumber=13742</link> <description> <![CDATA[ <img src="https://images-na.ssl-images-amazon.com/images/P/9355429991.01.TZZZZZZZ.jpg" alt="" /> ]]> <![CDATA[ <p> By Kanungo, Deepak K. 
.<br /> Kolkata Shroff 2023 .<br /> xv, 247 , Includes index 9789355429995 </p> ]]> <![CDATA[ <p> <a href="/cgi-bin/koha/opac-reserve.pl?biblionumber=13742">Place hold on <em>Probabilistic Machine learning for Finance and investing </em></a> </p> ]]> </description> <guid>/cgi-bin/koha/opac-detail.pl?biblionumber=13742</guid> </item> <item> <title> Machine Learning and Artificial Intelligence </title> <dc:identifier>ISBN:9783031122811</dc:identifier> <link>/cgi-bin/koha/opac-detail.pl?biblionumber=13797</link> <description> <![CDATA[ <img src="https://images-na.ssl-images-amazon.com/images/P/303112281X.01.TZZZZZZZ.jpg" alt="" /> ]]> <![CDATA[ <p> By Joshi, Ameet V. .<br /> Switzerland Springer 2023 .<br /> xxi, 271 p. 24 cm..<br /> 9783031122811 </p> ]]> <![CDATA[ <p> <a href="/cgi-bin/koha/opac-reserve.pl?biblionumber=13797">Place hold on <em>Machine Learning and Artificial Intelligence</em></a> </p> ]]> </description> <guid>/cgi-bin/koha/opac-detail.pl?biblionumber=13797</guid> </item> <item> <title> Machine learning : a hands-on approach </title> <dc:identifier>ISBN:9788197424984</dc:identifier> <link>/cgi-bin/koha/opac-detail.pl?biblionumber=13939</link> <description> <![CDATA[ <img src="https://images-na.ssl-images-amazon.com/images/P/8197424985.01.TZZZZZZZ.jpg" alt="" /> ]]> <![CDATA[ <p> By Robin, C R Rene.<br /> Kolkata University press 2025 .<br /> 804p. 
, includes index 9788197424984 </p> ]]> <![CDATA[ <p> <a href="/cgi-bin/koha/opac-reserve.pl?biblionumber=13939">Place hold on <em>Machine learning</em></a> </p> ]]> </description> <guid>/cgi-bin/koha/opac-detail.pl?biblionumber=13939</guid> </item> <item> <title> Synthetic media : navigating the future of ai and ml generated content, opportunities, threats and the future of humanity </title> <dc:identifier>ISBN:9789334224290</dc:identifier> <link>/cgi-bin/koha/opac-detail.pl?biblionumber=14232</link> <description> <![CDATA[ <img src="https://images-na.ssl-images-amazon.com/images/P/9334224290.01.TZZZZZZZ.jpg" alt="" /> ]]> <![CDATA[ <p> Kolkata Mitra 2025 .<br /> 462p. 9789334224290 </p> ]]> <![CDATA[ <p> <a href="/cgi-bin/koha/opac-reserve.pl?biblionumber=14232">Place hold on <em>Synthetic media</em></a> </p> ]]> </description> <guid>/cgi-bin/koha/opac-detail.pl?biblionumber=14232</guid> </item> <item> <title> Data science and machine learning with R </title> <dc:identifier>ISBN:9789354600333</dc:identifier> <link>/cgi-bin/koha/opac-detail.pl?biblionumber=14324</link> <description> <![CDATA[ <img src="https://images-na.ssl-images-amazon.com/images/P/9354600336.01.TZZZZZZZ.jpg" alt="" /> ]]> <![CDATA[ <p> By Thareja, Reema.<br /> Chennai McGraw Hill 2025 .<br /> 472p.
, Table of contents : Chapter 1: Introduction to Data Sciences and Machine Learning -- Chapter 2: Machine Learning Algorithms -- Chapter 3: Machine Learning Algorithms - II -- Chapter 4: Introduction to R -- Chapter 5: More on Data Structures -- Chapter 6: Decision Control and Looping Statements -- Chapter 7: Generating and Manipulating Data in R -- Chapter 8: Working with Data -- Chapter 9: Using dplyr() and tidyr() packages -- Chapter 10: Plotting graphs in R -- Chapter 11: Social Media Mining -- Chapter 12: Implementing Machine Learning Algorithms -- Chapter 13: Implementing Machine Learning Algorithms - II -- Index. Online content: Appendices, Case Studies. includes index 9789354600333 </p> ]]> <![CDATA[ <p> <a href="/cgi-bin/koha/opac-reserve.pl?biblionumber=14324">Place hold on <em>Data science and machine learning with R</em></a> </p> ]]> </description> <guid>/cgi-bin/koha/opac-detail.pl?biblionumber=14324</guid> </item> <item> <title> Machine learning principles and techniques </title> <dc:identifier>ISBN:9788198761934</dc:identifier> <link>/cgi-bin/koha/opac-detail.pl?biblionumber=14334</link> <description> <![CDATA[ <img src="https://images-na.ssl-images-amazon.com/images/P/8198761934.01.TZZZZZZZ.jpg" alt="" /> ]]> <![CDATA[ <p> Kolkata Aryan 2025 .<br /> 524p. , includes index 9788198761934 </p> ]]> <![CDATA[ <p> <a href="/cgi-bin/koha/opac-reserve.pl?biblionumber=14334">Place hold on <em>Machine learning principles and techniques</em></a> </p> ]]> </description> <guid>/cgi-bin/koha/opac-detail.pl?biblionumber=14334</guid> </item> <item> <title> Computational statistics and machine learning : as per the latest curriculum of the directives of NEP 2020 </title> <dc:identifier>ISBN:9789367615560</dc:identifier> <link>/cgi-bin/koha/opac-detail.pl?biblionumber=14337</link> <description> <![CDATA[ <img src="https://images-na.ssl-images-amazon.com/images/P/9367615566.01.TZZZZZZZ.jpg" alt="" /> ]]> <![CDATA[ <p> By Pal, Shilpi.<br /> New Delhi Global net 2025 .<br /> 356p.
9789367615560 </p> ]]> <![CDATA[ <p> <a href="/cgi-bin/koha/opac-reserve.pl?biblionumber=14337">Place hold on <em>Computational statistics and machine learning</em></a> </p> ]]> </description> <guid>/cgi-bin/koha/opac-detail.pl?biblionumber=14337</guid> </item> <item> <title> Machine learning principles and techniques </title> <dc:identifier>ISBN:9788198761934</dc:identifier> <link>/cgi-bin/koha/opac-detail.pl?biblionumber=14355</link> <description> <![CDATA[ <img src="https://images-na.ssl-images-amazon.com/images/P/8198761934.01.TZZZZZZZ.jpg" alt="" /> ]]> <![CDATA[ <p> Kolkata Aryan 2025 .<br /> 524p. , includes index 9788198761934 </p> ]]> <![CDATA[ <p> <a href="/cgi-bin/koha/opac-reserve.pl?biblionumber=14355">Place hold on <em>Machine learning principles and techniques</em></a> </p> ]]> </description> <guid>/cgi-bin/koha/opac-detail.pl?biblionumber=14355</guid> </item> <item> <title> Machine learning : theory and practice </title> <dc:identifier>ISBN:9789393330697</dc:identifier> <link>/cgi-bin/koha/opac-detail.pl?biblionumber=14417</link> <description> <![CDATA[ <img src="https://images-na.ssl-images-amazon.com/images/P/9393330697.01.TZZZZZZZ.jpg" alt="" /> ]]> <![CDATA[ <p> By Murty, M.N..<br /> Kolkata University press 2024 .<br /> xi, 332p. 
, Preface Acknowledgements List of Acronyms Chapter 1: Introduction to Machine Learning Evolution of Machine Learning | Paradigms for ML | Learning by Rote | Learning by Deduction | Learning by Abduction | Learning by Induction | Reinforcement Learning | Types of Data | Matching | Stages in Machine Learning | Data Acquisition | Feature Engineering | Data Representation | Model Selection | Model Learning | Model Evaluation | Model Prediction | Model Explanation | Search and Learning | Explanation Offered by the Model | Data Sets Used Chapter 2: Nearest Neighbor-Based Models Introduction to Proximity Measures | Distance Measures | Minkowski Distance |Weighted Distance Measure | Non-Metric Similarity Functions | Levenshtein Distance | Mutual Neighborhood Distance (MND) | Proximity Between Binary Patterns | Different Classification Algorithms Based on the Distance Measures | Nearest Neighbor Classifier (NNC) | K-Nearest Neighbor Classifier | Weighted K-Nearest Neighbor (WKNN) Algorithm | Radius Distance Nearest Neighbor Algorithm | Tree-Based Nearest Neighbor Algorithm | Branch and Bound Method | Leader Clustering | KNN Regression | Concentration Effect and Fractional Norms | Performance Measures | Performance of Classifiers | Performance of Regression Algorithms | Area Under the ROC Curve for the Breast Cancer Data Set Chapter 3: Models Based on Decision Trees Introduction to Decision Trees | Decision Trees for Classification | Impurity Measures for Decision Tree Construction | Properties of the Decision Tree Classifier (DTC) | Applications in Breast Cancer Data | Embedded Schemes for Feature Selection | Regression Based on Decision Trees | Bias–Variance Trade-off | Random Forests for Classification and Regression | Comparison of DT and RF Models on Olivetti Face Data | AdaBoost Classifier | Regression Using DT-Based Models | Gradient Boosting (GB) | Practical Application Chapter 4: The Bayes Classifier Introduction to the Bayes Classifier | Probability, Conditional 
Probability and Bayes’ Rule | Conditional Probability | Total Probability | Bayes’ Rule and Inference | Bayes’ Rule and Classification | Random Variables, Probability Mass Function, Probability Density Function and Cumulative Distribution Function, Expectation and Variance | Random Variables | Probability Mass Function (PMF) | Binomial Random Variable | Cumulative Distribution Function (CDF) | Continuous Random Variables | Expectation of a Random Variable | Variance of a Random Variable | Normal Distribution | The Bayes Classifier and its Optimality | Multi-Class Classification | Parametric and Non-Parametric Schemes for Density Estimation | Parametric Schemes | Class Conditional Independence and Naïve Bayes Classifier | Estimation of the Probability Structure | Naive Bayes Classifier (NBC) Chapter 5: Machine Learning Based on Frequent Itemsets Introduction to the Frequent Itemset Approach | Frequent Itemsets | Frequent Itemset Generation | Frequent Itemset Generation Strategies | Apriori Algorithm | Frequent Pattern Tree and Variants | FP Tree-Based Frequent Itemset Generation | Pattern Count (PC) Tree-Based Frequent Itemset Generation | Frequent Itemset Generation Using the PC Tree | Dynamic Mining of Frequent Itemsets | Classification Rule Mining | Frequent Itemsets for Classification Using PC Tree | Frequent Itemsets for Clustering Using the PC Tree Chapter 6: Representation Introduction to Representation | Feature Selection | Linear Feature Extraction | Vector Spaces | Basis of a Vector Space | Row Vectors and Column Vectors | Linear Transformations | Eigenvalues and Eigenvectors | Symmetric Matrices | Rank of a Matrix | Principal Component Analysis | Experimental Results on Olivetti Face Data | Singular Value Decomposition | PCA and SVD | Random Projections Chapter 7: Clustering Introduction to Clustering | Partitioning of Data | Data Re-organization | Data Compression | Summarization | Matrix Factorization | Clustering of Patterns | Data Abstraction | 
Clustering Algorithms | Divisive Clustering | Agglomerative Clustering | Partitional Clustering | K-Means Clustering | K-Means++ Clustering | Soft Partitioning | Soft Clustering | Fuzzy C-Means Clustering | Rough Clustering | Rough K-Means Clustering Algorithm | Expectation Maximization-Based Clustering | Spectral Clustering | Clustering Large Data Sets | Divide-and-Conquer Method Chapter 8: Linear Discriminants for Machine Learning Introduction to Linear Discriminants | Linear Discriminants for Classification | Parameters Involved in the Linear Discriminant Function | Learning w and b | Perceptron Classifier | Perceptron Learning Algorithm | Convergence of the Learning Algorithm | Linearly Non-Separable Classes | Multi-Class Problems | Support Vector Machines | Linearly Non-Separable Case | Non-linear SVM | Kernel Trick | Logistic Regression | Linear Regression | Sigmoid Function | Learning w and b in Logistic Regression | Multi-Layer Perceptrons (MLPs) | Backpropagation for Training an MLP | Results on the Digits Data Set Chapter 9: Deep Learning Introduction to Deep Learning | Non-Linear Feature Extraction Using Autoencoders | Comparison on the Digits Data Set | Deep Neural Networks | Activation Functions | Initializing Weights | Improved Optimization Methods | Adaptive Optimization | Loss Functions | Regularization | Adding Noise to the Output or Label Smoothing | Experimental Results on the MNIST Data Set | Convolutional Neural Networks | Convolution | Padding Zero Rows and Columns | Pooling to Reduce Dimensionality | Recurrent Neural Networks | Training an RNN | Encoder–Decoder Models | Generative Adversarial Networks Conclusions Appendix – Hints to Practical Exercises Index 9789393330697 </p> ]]> <![CDATA[ <p> <a href="/cgi-bin/koha/opac-reserve.pl?biblionumber=14417">Place hold on <em>Machine learning</em></a> </p> ]]> </description> <guid>/cgi-bin/koha/opac-detail.pl?biblionumber=14417</guid> </item> <item> <title> Machine learning : a hands-on approach 
</title> <dc:identifier>ISBN:9788197424984</dc:identifier> <link>/cgi-bin/koha/opac-detail.pl?biblionumber=14419</link> <description> <![CDATA[ <img src="https://images-na.ssl-images-amazon.com/images/P/8197424985.01.TZZZZZZZ.jpg" alt="" /> ]]> <![CDATA[ <p> By Robin, C R Rene.<br /> Kolkata University press 2025 .<br /> xiv, 804p. , Chapter 1 Introduction to Machine Learning Introduction | What is Machine Learning? | History of Machine Learning | Role of Machine Learning in Computer Science and Problem Solving | Why Machine Learning? | Adaptivity of Machine Learning | Designing versus Learning | Training versus Testing in Machine Learning | Machine Learning versus Automation | Predictive and Descriptive Tasks in Machine Learning | Some Terminology Related to Machine Learning | Types of Machine Learning | Passive Learning versus Active Learning | Online versus Batch Machine Learning | Differences between Machine Learning Models and Algorithms | Disadvantages of Data-driven Solutions | Well-posed Machine Learning Problems | Designing a Learning System (Life Cycle of Machine Learning) Chapter 2 Probability Theory and Statistics in Machine Learning Introduction | Probability | Probability Theory | Joint, Marginal, and Conditional Probability | Statistics | Key Concepts of Probability Distributions | Probability Distributions | Examples of Probability Distributions | Conditional Distribution | Joint Distribution | Combinatorics | Probability Rules and Axioms | Moment Generating Function | Maximum Likelihood Estimation | Density Functions | Density Estimation | Challenges and Future Directions in Probability and Statistics for Machine Learning Chapter 3 Linear Algebra Introduction | Linear Regression | Matrix Decomposition | Vectors and Matrices | Eigenvalue and Eigenvectors | Norms and Vector Spaces | Optimization | Linear Transformation | Cramer’s Rule | Gaussian Elimination | LU Decomposition | QR Decomposition | Eigen Decomposition | Symmetric Matrices | 
Orthogonalization | Deep Learning with Linear Algebra Chapter 4 Algorithms and Complex Optimizations Sets, Relations, and Functions | Convex Sets and Convex Functions | Optimization Problems | Convex Optimization | Unconstrained Optimization | Constrained Optimization | Dual Optimization Problems | Dynamic Programming | Sublinear Algorithms | Graphs | Transforms | Information Theory | Manifolds Chapter 5 Computational Learning Theory Introduction | Objectives of Computational Learning Theory | History | Importance of Computational Learning Theory | The Main Methods | Probably Approximately Correct Learning | Complexity Theory of Machine Learning | Mistake-bound Learning Model | Instance-based learning | Lazy and Eager Learning | Generative Learning | Consistent Learning | Worst Case (Online) Learning | Applications of Computational Learning Theory | Evaluation Metrics for Computational Learning Theory | Future Directions in Computational Learning Theory Chapter 6 Machine Learning Models Introduction | Models in Machine Learning | Features | Concept Learning Chapter 7 Unsupervised Learning Introduction | What and Why of Unsupervised Learning | Types of Unsupervised Learning | Markov Models | Hidden Markov Model | Matrix Factorization and Matrix Completion Models | Generative Models | Latent Factor Models | Inference Models | Non-negative Matrix Factorization | Advantages of Unsupervised Learning | Disadvantages of Unsupervised Learning Chapter 8 Supervised Learning: Classification Introduction | K-Nearest Neighbor (KNN) | Decision Trees | Random Forests | Linear Classifiers | Applications of Supervised Learning | Limitations and Challenges of Supervised Learning Chapter 9 Supervised Learning: Regression Introduction | Linear Regression versus Non-Linear Regression | Types of Linear Models | Least Squares Method (LSM) | Multivariate Linear Regression | Nonlinearity and Kernel Methods | Generalized Linear Models | AdaBoost (Adaptive Boosting) | Regularized Regression 
| Backpropagation | Support Vector Regression | Decision Tree Regression | Random Forest Regression | Neural Network Regression | Multi-layer Propagation | Radial Basis Functions | Splines | Curse of Dimensionality | Interpolations and Basis Functions | Multi-class/Structured Outputs, Ranking Chapter 10 Artificial Neural Networks Introduction to Neural Networks | Introduction to Artificial Neural Networks | Types of Artificial Neural Networks | Other Types of ANNs | Building Neural Network Architectures | Training Neural Networks with Backpropagation | Autoencoders | Applications of ANNs | Future of ANNs Chapter 11 Trends in Machine Learning Reinforcement Learning | Multitask Learning | Online Learning | Sequence Learning | Prediction Learning | Bagging and Boosting in Machine Learning | Trends in Machine Learning Technology Chapter 12 Applications of Machine Learning in Various Industries Real-World Problems Solved by Machine Learning | Applications of Machine Learning in the Retail Industry | Applications of Machine Learning in the Logistics Industry | Applications of Machine Learning in the Manufacturing Industry | Applications of Machine Learning in the Energy and Utilities Industry | Applications of Machine Learning in the Travel Industry | Applications of Machine Learning in the Banking Industry | Applications of Machine Learning in the Finance Industry | Applications of Machine Learning in the Insurance Industry Chapter 13 Machine Learning Programming: Capstone Projects Using Python and R Introduction | Installing Python | The sklearn Package | Anaconda Navigator | Data Operations on the Iris Data Set | Finding Outliers | Removing Outliers | Imputing Null Values | Capping the Outlier Values | Splitting the Data into Training and Testing Data | Training and Evaluating the Model | Regularization Techniques that Prevent Overfitting | Implement Linear Regression | Implement Logistic Regression | Decision Tree Classifier | Implement SVM | Implement PCA | 
Implement Steepest Descent | Implement Random Forest | Implement Random Search | Implement Naïve Bayes | Implement Single-Layer Perceptron Learning Algorithm | Implement Radial Basis Functions | Implement Linear Classifier | Implement Bayesian Classifier | Implement K-Nearest Neighbor Classifier | Implement Linear Discriminant Analysis | Implement Locality Preserving Projection | Implement Logic Gates without Perceptron Model | Implement Logic Gates with Perceptron Model | Handwritten Classification using CNN | Introduction to R Programming Chapter 14 Machine Learning Programming Using Jupyter Notebook Introduction | Using the Online Interface of Jupyter Notebook | A Python Program that Demonstrates the Use of Data Types | A Python Program that Asks for User Input and Uses Conditional Statements to Respond with Different Outputs | A Python Program that Prints Out a Sequence of Numbers using a for Loop and then Asks the User to Do the Same with a while Loop | A Python Program that Defines a Function to Calculate the Area of a Circle, Given its Radius, and then Calls that Function with Different Values | A Python Program that Creates a List of Items and a Dictionary of Key–Value Pairs, and then Demonstrates How to Access and Modify Elements | A Python Program that Reads a Text File, Counts the Number of Words, and Writes the Result to a New File | A Python Program that Intentionally Raises an Error and then Catches it with a try-except Block, Printing an Informative Message to the User | A Program to Define a Python Class with Attributes and Methods to Demonstrate OOP | A Program to Use the Matplotlib Library to Plot a Graph based on the Given Data Points, and Enhancing the Graph with Labels and a Legend | A Program to Introduce the pandas Library by Creating a DataFrame from a Dictionary, and Performing Basic Data Manipulation Operations such as Sorting and Filtering | Basic Operations Using the Iris Data Set | Iris Data Loading and Visualization | Data Processing | 
Feature Selection | Classification Algorithm: SVM | Classification Algorithm: Decision Tree | Classification Algorithm: KNN | Classification Algorithm: Logistic Regression | Model Evaluation | Hyperparameter Tuning | Cross-validation | Ensemble Methods | Clustering Analysis | Deep Learning | Deep Learning: Keras Appendix A: Model Course Structure Appendix B: Model Question Papers 9788197424984 </p> ]]> <![CDATA[ <p> <a href="/cgi-bin/koha/opac-reserve.pl?biblionumber=14419">Place hold on <em>Machine learning</em></a> </p> ]]> </description> <guid>/cgi-bin/koha/opac-detail.pl?biblionumber=14419</guid> </item> <item> <title> Machine learning : aicte recommended text book </title> <dc:identifier>ISBN:9789386173423</dc:identifier> <link>/cgi-bin/koha/opac-detail.pl?biblionumber=14599</link> <description> <![CDATA[ <img src="https://images-na.ssl-images-amazon.com/images/P/9386173425.01.TZZZZZZZ.jpg" alt="" /> ]]> <![CDATA[ <p> By Chopra, Rajiv.<br /> New Delhi Khanna 2025 .<br /> xii, 348p. 
, includes index 9789386173423 </p> ]]> <![CDATA[ <p> <a href="/cgi-bin/koha/opac-reserve.pl?biblionumber=14599">Place hold on <em>Machine learning</em></a> </p> ]]> </description> <guid>/cgi-bin/koha/opac-detail.pl?biblionumber=14599</guid> </item> <item> <title> Mathematics for Machine Learning </title> <dc:identifier>ISBN:9781009108850</dc:identifier> <link>/cgi-bin/koha/opac-detail.pl?biblionumber=14612</link> <description> <![CDATA[ <img src="https://images-na.ssl-images-amazon.com/images/P/1009108859.01.TZZZZZZZ.jpg" alt="" /> ]]> <![CDATA[ <p> By Deisenroth, Marc Peter .<br /> New Delhi Cambridge University Press 2021 .<br /> xvii, 371 , Includes index &amp; reference 9781009108850 </p> ]]> <![CDATA[ <p> <a href="/cgi-bin/koha/opac-reserve.pl?biblionumber=14612">Place hold on <em>Mathematics for Machine Learning </em></a> </p> ]]> </description> <guid>/cgi-bin/koha/opac-detail.pl?biblionumber=14612</guid> </item> <item> <title> Deep learning : algorithms and applications </title> <dc:identifier>ISBN:9789366606606</dc:identifier> <link>/cgi-bin/koha/opac-detail.pl?biblionumber=14625</link> <description> <![CDATA[ <img src="https://images-na.ssl-images-amazon.com/images/P/9366606601.01.TZZZZZZZ.jpg" alt="" /> ]]> <![CDATA[ <p> By Ramalakshmi, R..<br /> Delhi Cengage 2026 .<br /> various pages , Part I Foundations 1. Introduction to Deep Learning 2. Machine Learning Fundamentals 3. Mathematical Building Blocks Part II Deep Learning Models 4. Artificial Neural Network (ANN) 5. Convolutional Neural Network (CNN) 6. Recurrent Neural Network (RNN) and Long Short-Term Memory (LSTM) 7. Gated Recurrent Unit (GRU) and Generative Adversarial Networks (GANs) 8. Optimization Algorithms and Regularization Techniques Part III Advanced Techniques in Deep Learning 9. Auto Encoders, Attention Mechanisms, and Transformers 10. Reinforcement Learning (RL) and Deep Q-Network (DQN) 11. 
Neural Architecture Search (NAS) and Automated Machine Learning (AutoML) Part IV Deep Learning Frameworks and Tools 12. Popular Deep Learning Frameworks and Libraries Part V Ethical and Social Implications of Deep Learning 13. Bias, Fairness, Data Protection, and Ethical Challenges in DL Models Part VI Deep Learning Applications and Future Trends 14. DL Applications in Computer Vision, NLP, Recommender Systems, and Time-series 15. Challenges, Opportunities and Future Research Trends 9789366606606 </p> ]]> <![CDATA[ <p> <a href="/cgi-bin/koha/opac-reserve.pl?biblionumber=14625">Place hold on <em>Deep learning</em></a> </p> ]]> </description> <guid>/cgi-bin/koha/opac-detail.pl?biblionumber=14625</guid> </item> </channel> </rss>
