
Engagement Data to Guide Product Development

What is SmartThings Analytics?

With our recent update, once your products are Works with SmartThings certified, you can use SmartThings Analytics to gather PII-safe engagement data about how they are being used in the field. The best part: you can see information like the number of registered and active devices, where your customers are located, and the SmartThings Capabilities that are most commonly used — all of which reveals how customers actually interact with your products.


Leverage these insights to optimize your products and product roadmap.

“In the past, we were unable to obtain engagement data after purchase, but now, the Analytics Dashboard has helped us create actionable insights.”

Maude Shen, Software Product Manager, WiZ Connected






Why Should Developers and Product Teams Use SmartThings Analytics?

We are always looking for ways to better connect partners with our millions of users.


We know that it can be difficult to collect user engagement data — especially for partners who manufacture Hub Connected devices.


Last year, we announced our first version of Analytics, which included Registered and Active devices. After obtaining Works with SmartThings certification, partners can easily access SmartThings Analytics from the Console.


Since receiving positive feedback on the first version of Analytics, we’ve continued to improve performance and expand the types of data partners can access. New this year, we are providing both country-level data and Capability-level usage. With Capability data, you can answer questions about the most commonly used features of your product, like “Are users actually changing the color of their lights?”


Leverage these insights to drive better outcomes for your products and users, and optimize your products based on real user interactions.



Key Features of SmartThings Analytics

  • Real-world Usage Data: SmartThings Analytics makes it easy to know whether users are engaged with your products by providing real-world usage data, built to be actionable while still protecting user privacy.

  • User-friendly Interface: With an intuitive and user-friendly interface, the Analytics tool ensures a seamless experience for product owners and developers of all levels.

  • Multiple Ways to Gain Insights: Search by an individual product or your entire catalog, within one country or across a region. Look back 7, 30, or 90 days, or set a custom date range. You can also see which SmartThings Capabilities are being used — there are plenty of ways to get actionable engagement data to drive product decisions.


Product Questions that SmartThings Analytics Answers


  • Where am I getting most of my product sales? In which countries do I have the most active user base?
  • What features / SmartThings Capabilities do my customers use the most? Which are used the least?
  • How many active devices do I have compared to registered devices?
  • How are the active devices and registered devices trending over time?
  • Did device registrations go up after our recent marketing campaign? Or after the event where we showcased our brand?

How to Get Started with SmartThings Analytics

Ready to view Analytics for your device(s)? Here's a quick resource guide to help you get started and show you the features:



1. Begin by Accessing Analytics

Analytics is only available to partners who have Works with SmartThings (WWST) certified products.

Visit the SmartThings Developer Console and navigate to the Analytics section.



Keep reading for a screen-by-screen walkthrough, or check out this video.



If you have a WWST certified product but do not have access to the Console, email us at [email protected].








2. Set Up Your Search Parameters


With Analytics you can search by:


Product(s): Search by one product or all to see a portfolio view of how your products are being used.



Date: See the last 7, 30, or 90 days of data, or choose a custom date range.



Location: View data by country or region.






3. View Data for Insights

Registered Devices: View devices that are registered with SmartThings.




Active Devices: View devices with an event or an online status in the last 24 hours. Devices may move between the active and inactive states over time.
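As a rough illustration (not the actual SmartThings implementation), this 24-hour activity rule amounts to filtering devices by their most recent event timestamp:

```python
from datetime import datetime, timedelta, timezone

def count_active_devices(last_event_times, now=None, window_hours=24):
    """Count devices with an event inside the activity window.

    last_event_times: dict mapping device_id -> datetime of the device's
    most recent event or online status check-in.
    """
    now = now or datetime.now(timezone.utc)
    cutoff = now - timedelta(hours=window_hours)
    return sum(1 for ts in last_event_times.values() if ts >= cutoff)

# Hypothetical example: two of three devices reported in the last 24 hours.
now = datetime(2024, 9, 11, 12, 0, tzinfo=timezone.utc)
events = {
    "lamp-1": now - timedelta(hours=2),   # active
    "plug-7": now - timedelta(hours=23),  # active
    "cam-3":  now - timedelta(days=3),    # inactive
}
print(count_active_devices(events, now=now))  # 2
```

A device that checks in again after a quiet period simply crosses back over the cutoff, which is why counts can move in both directions day to day.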




Capabilities: See which SmartThings Capabilities are being used across your product or products, and get a glimpse of which are used most.
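A question like “Are users actually changing the color of their lights?” reduces to tallying capability usage. A minimal sketch — the capability IDs follow the SmartThings naming style, but the event data here is made up:

```python
from collections import Counter

# Hypothetical per-event capability log for one product.
events = [
    "switch", "switch", "colorControl", "switchLevel",
    "switch", "colorControl", "switch", "switchLevel", "switch",
]

usage = Counter(events)
for capability, count in usage.most_common():
    print(f"{capability}: {count}")
```

Here `switch` dominates while `colorControl` trails, which would suggest users toggle their lights far more often than they re-color them.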




Geo Location: See where your products are being used, by country or by region.


Any of these views can be made full screen or zoomed in on, and the data can also be exported as a CSV.
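Once exported, the CSV can feed your own tooling — for example, tracking the active-to-registered ratio over time. A minimal sketch, assuming the export contains `date`, `registered_devices`, and `active_devices` columns (the actual column names in the export may differ):

```python
import csv
import io

# Hypothetical excerpt of an Analytics CSV export; real columns may differ.
export = io.StringIO(
    "date,registered_devices,active_devices\n"
    "2024-09-01,1000,620\n"
    "2024-09-02,1010,655\n"
    "2024-09-03,1025,700\n"
)

rows = list(csv.DictReader(export))
for row in rows:
    registered = int(row["registered_devices"])
    active = int(row["active_devices"])
    ratio = active / registered
    print(f"{row['date']}: {ratio:.1%} of registered devices were active")
```

A rising ratio suggests growing engagement; a falling one, even with registrations climbing, can flag devices going dormant after setup.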




Our latest Analytics update is another way we provide value to WWST Certified partners. Access to this variety of usage data is critical to understanding how your products are being used.

Ready to gain more product insights? Navigate to Analytics in the Certification Console to see these insights and apply them to your product strategy.

Want to integrate your device with SmartThings? Visit https://developer.smartthings.com to access tools like Edge Builder and Test Suite, and then leverage our Certification Console to get your device(s) certified.
