
Learn-to-Rank-Ads in Computational Advertising

Jianchang (JC) Mao, Head of Advertising Sciences, Yahoo! Labs

Hundreds of millions of internet users enjoy a plethora of free web services, ranging from search, email, news, sports, finance, and video to various social networking services. Most of these free services are fueled by online advertising, a multi-billion-dollar industry. Yet online advertising spending still accounts for only about 10% of the global advertising market of nearly half a trillion dollars. As users spend more time online, advertisers are shifting more of their budgets to online advertising. The rapid growth of online advertising has created enormous opportunities as well as technical challenges that demand computational intelligence. Computational Advertising has emerged as a new interdisciplinary field devoted to solving the challenging problems that arise in online advertising. The central problem of computational advertising is to find the best-matching ads from a large ad inventory for a user in a given context (e.g., a query or a page view) under certain business constraints (blocking, targeting, guaranteed delivery, etc.).
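To make this concrete, here is a minimal Python sketch of that central problem (my own illustration, not a detail from the talk; the Ad fields, the keyword-overlap CTR stand-in, and the bid-times-predicted-CTR scoring rule are all assumptions): filter the inventory by blocking and targeting constraints, then rank the surviving ads by expected value.

    from dataclasses import dataclass, field

    @dataclass
    class Ad:
        ad_id: str
        bid: float                                   # advertiser's bid per click
        target_keywords: set = field(default_factory=set)
        blocked_domains: set = field(default_factory=set)

    def predicted_ctr(ad, query_terms):
        """Stand-in click-through-rate model: fraction of query terms the ad targets."""
        return len(ad.target_keywords & query_terms) / len(query_terms) if query_terms else 0.0

    def select_ads(inventory, query_terms, page_domain, k=3):
        """Toy matching: enforce blocking/targeting constraints, then rank the
        remaining ads by expected value = bid * predicted CTR."""
        eligible = [ad for ad in inventory
                    if page_domain not in ad.blocked_domains
                    and ad.target_keywords & query_terms]
        eligible.sort(key=lambda ad: ad.bid * predicted_ctr(ad, query_terms), reverse=True)
        return eligible[:k]

    # Example request: a "running shoes" query on a news page.
    inventory = [Ad("a1", 0.80, {"running", "shoes"}),
                 Ad("a2", 1.50, {"shoes"}, {"news.example.com"}),
                 Ad("a3", 0.40, {"laptops"})]
    print([ad.ad_id for ad in select_ads(inventory, {"running", "shoes"}, "news.example.com")])

In practice the constraint handling (especially guaranteed delivery) and the CTR prediction are far more involved, but the argmax-over-eligible-ads structure is the same.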

In the first part of this talk, I will provide a brief introduction to various forms of online advertising, including search advertising, contextual advertising, and guaranteed and non-guaranteed display advertising. For each form, I will describe the problem formulation and its accompanying computational challenges.

In the second part of this talk, I will provide a case study on Learn-to-Rank-Ads to illustrate how machine learning techniques can be employed to attack the central problem in computational advertising. Learning to rank has attracted the attention of many machine learning researchers over the last decade, and a number of learning-to-rank algorithms have been proposed in the literature. Until recently, however, most of these algorithms did not use a loss function tied to popular relevance measures such as NDCG (Normalized Discounted Cumulative Gain) and MAP (Mean Average Precision). The main difficulty in directly optimizing these measures is that they depend on the ranks of objects rather than on the numerical scores output by a ranking function. We propose a fully Bayesian framework that addresses this challenge by optimizing the expectation of the NDCG measure over all possible permutations of objects. A relaxation strategy is used to approximate this expectation over the space of permutations, and a bound-optimization approach is employed to keep the computation efficient. Extensive experiments show that the proposed algorithm outperforms state-of-the-art learning-to-rank algorithms on several benchmark data sets.
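To make the objective concrete, the following Python sketch (my own illustration, not the talk's algorithm) computes the expectation of NDCG exactly over all permutations of a tiny list, using a Plackett-Luce model on the ranking function's scores to assign each permutation a probability. The relaxation and bound-optimization machinery described above exists precisely to avoid this factorial enumeration at realistic list sizes.

    import itertools
    import math

    def dcg(relevances):
        """Discounted cumulative gain of relevance grades listed in rank order."""
        return sum((2**rel - 1) / math.log2(pos + 2) for pos, rel in enumerate(relevances))

    def ndcg(relevances):
        """DCG normalized by the DCG of the ideal (descending-relevance) ordering."""
        ideal = dcg(sorted(relevances, reverse=True))
        return dcg(relevances) / ideal if ideal > 0 else 0.0

    def plackett_luce_prob(perm, scores):
        """Probability of drawing the objects in this order, proportional to exp(score)."""
        weights = [math.exp(scores[i]) for i in perm]
        prob, total = 1.0, sum(weights)
        for w in weights:
            prob *= w / total
            total -= w
        return prob

    def expected_ndcg(scores, relevances):
        """Exact E[NDCG] over all n! permutations -- only feasible for tiny n."""
        n = len(scores)
        return sum(plackett_luce_prob(p, scores) * ndcg([relevances[i] for i in p])
                   for p in itertools.permutations(range(n)))

    # Four ads with model scores and editorial relevance grades (0-3).
    print(expected_ndcg(scores=[2.0, 0.5, 1.0, -0.3], relevances=[3, 1, 2, 0]))

A learning algorithm would adjust the scores to push this expectation up; the challenge the talk addresses is doing so efficiently without enumerating permutations.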

Dr. Jianchang (JC) Mao is currently the head of Advertising Sciences at Yahoo! Labs, responsible for the R&D of Search Advertising, Contextual Advertising, Display Advertising, Targeting, and Categorization technologies and products. He was also a Science/Engineering director responsible for developing backend technologies for several Yahoo! Social Search products, including Y! Answers and Y! MyWeb (Social Bookmarks). Prior to joining Yahoo!, Dr. Mao was Director of Emerging Technologies & Principal Architect at Verity Inc., a leader in Enterprise Search (acquired by Autonomy), from 2000 to 2004. Before that, he was a research staff member at the IBM Almaden Research Center from 1994 to 2000. Dr. Mao's research interests include Machine Learning, Data Mining, Information Retrieval, Computational Advertising, Social Networks, Pattern Recognition, and Image Processing. He received an Honorable Mention Award in the ACM KDD Cup 2002, the IEEE Transactions on Neural Networks Outstanding Paper Award in 1996, and an Honorable Mention Award from the International Pattern Recognition Society in 1993. Dr. Mao served as an associate editor of the IEEE Transactions on Neural Networks from 1999 to 2000. He received his Ph.D. degree in Computer Science from Michigan State University in 1994.

Sponsored by

STIET