
Crowdsource by Google: Build Better Products for Everyone with Machine Learning

Published Dec 18, 2020 02:14 pm

With AI powering some of the most magical experiences in people's favorite apps and products, it is important to ensure that AI systems can recognize and understand a diverse set of languages and cultures. Google launched Crowdsource to empower people everywhere to help train Google's machine learning models directly. In addition, data open sourced via Crowdsource enables the wider developer and research community to advance the state of the art in technology with diversity in mind. The free platform, available as an app for Android via Google Play as well as through a web browser, aims to bring diversity and inclusiveness to Google's AI systems.

Using gamified activities, Crowdsource by Google allows users to answer simple, fun questions that generate more diverse and inclusive training data for ML models. The answers from millions of users around the world help make their favorite Google products, such as Maps, Photos, Translate, and Assistant, even more delightful as they become inclusive of each user's language, region, and culture.

Early next year, Crowdsource will host a free online workshop, called "Explore ML with Crowdsource", for anyone interested in learning about machine learning. Anyone above 18 years of age, with or without a tech background, is welcome to learn the basics of machine learning and understand how it powers some of their favorite Google products and services. Interested participants are encouraged to sign up by February 28, 2021.

Crowdsource influencers, who have been trained by engineers and domain experts at Google, will conduct this free online workshop. They will help participants build a foundation in machine learning and neural networks, complemented by hands-on activities that apply the concepts in practice for a better grasp of the module.

Machine learning for everyone

Through Crowdsource, anybody can become a Contributor and take part in the community based on their interests and background. Language enthusiasts can work on translation activities, those interested in photography can tackle Smart Camera or Image Capture, and users who love looking at pictures from around the globe can take on Image Label Verification. There is also Handwriting Recognition, where users transcribe handwriting samples and help create training data for the models that power handwriting recognition in products like Gboard.

Community members can also take on the Image Captions activity, which helps improve the technology behind accessibility features like Image Description in the Chrome browser. A Sentiment Evaluation activity is also available to help AI systems better understand YouTube and Google Play Store reviews. All contributions and answers are used in machine learning-based products to make them work well for the diversity of the global population.
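To make the idea of crowdsourced training data concrete, here is a minimal, purely illustrative sketch, not Google's actual pipeline or data format, of how volunteer-contributed sentiment labels could be used to train a simple review classifier. All example reviews and labels below are hypothetical.

```python
# Illustrative only: toy example of turning crowdsourced sentiment labels
# (hypothetical data, not the real Crowdsource format) into training data
# for a small review-sentiment classifier.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Each entry pairs a review snippet with a label contributed by volunteers.
crowdsourced_examples = [
    ("Love the new update, the app feels much faster", "positive"),
    ("Keeps crashing whenever I open the camera", "negative"),
    ("Works fine but the ads are annoying", "negative"),
    ("Exactly what I needed for my daily commute", "positive"),
]

texts, labels = zip(*crowdsourced_examples)

# A TF-IDF plus logistic regression pipeline stands in for the much larger
# models that products actually train on this kind of labeled data.
model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(texts, labels)

print(model.predict(["The assistant understood my accent perfectly"]))
```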

Interested volunteers can become a Crowdsource Contributor simply by downloading the app or signing up through the website. Those with a strong passion for community building can choose to level up and become a Crowdsource influencer, gaining access to exclusive Google-led learning sessions and events where they can hone their soft skills and sharpen their technical knowledge of the basics of ML and AI.
