
Meta's latest breakthroughs

Published Sep 30, 2024 07:43 am

At the recently concluded two-day Meta Connect 2024 event, Meta unveiled a wave of exciting innovations, once again pushing the boundaries of virtual reality (VR), augmented reality (AR), and artificial intelligence (AI).

Formerly known as Oculus Connect, the event was rebranded Meta Connect in 2021 as Meta evolved its vision toward the Metaverse, broadening its scope beyond VR to include AR, AI, and other technologies. It continues to serve as a platform for Meta to announce its latest hardware, software, and AI developments to both developers and consumers.

 

Several advancements were unveiled at Meta Connect 2024. Below is a summary of the key announcements:


 

[Image: Meta Quest 3S headset]

 

The Meta Quest 3S is a budget-friendly VR headset. It uses Fresnel lenses with a lower resolution (1,832 x 1,920 per eye) and starts at 128GB of storage. The Quest 3S offers the same core capabilities as the Quest 3, but at a lower price, making VR technology accessible to more people.

 

[Image: Ray-Ban Meta smart glasses]

 

The Ray-Ban Smart Glasses are Meta's updated version of its smart glasses. Built in partnership with Ray-Ban, they feature improved cameras, enhanced audio, and AI-powered features such as live translation and object recognition.
 

AI chatbots with celebrity voices, including John Cena and Kristen Bell, were introduced, enhancing Meta's AI capabilities. These chatbots can now handle more conversational interactions and edit photos through text prompts on Meta platforms.

 

Meta AI is Meta's AI assistant, available across platforms like WhatsApp, Instagram, and Messenger. The event also showcased Meta's generative AI tool, which can create images and stickers in seconds from text prompts, enhancing creativity across Meta's apps. AI Studio was likewise introduced, allowing users to create their own custom AI personalities.
 

Meta's Llama 3.2 positions the company in competition with other advanced AI platforms like ChatGPT. The model has vision capabilities that enable it to analyze and describe images.

 

[Image: Orion holographic avatar]

 

Meta also previewed the Orion Holographic Avatar, a new AR feature that enables lifelike holographic avatars for more immersive communication in virtual environments.
 

So there you have it. Through these advancements, Meta shows how AR, VR, and AI are becoming increasingly relevant as they integrate into our daily lives, bringing sophisticated technology within everyone's reach.
