Remote Device Manager, An Easy Way to Launch Your Application with Tizen Studio


STF News
The Remote Device Manager provides a mechanism to deploy a project remotely from Tizen Studio to a Tizen-enabled device, such as Galaxy Watch. Tizen-enabled devices can be connected or disconnected through the Remote Device Manager if they are on the same network. Once the connection is made, a device log is shown in the Log View. You can also use the interface of the Remote Device Manager for executing SDB shell commands.

Prerequisites: Tizen Studio 2.0 or higher

Launch a project with Remote Device Manager

Step 1: Disable Bluetooth

If the watch has not been upgraded and its Tizen version is below 5.0, Bluetooth must be disabled during this process. On upgraded watches, you do not need to disable Bluetooth.

Path: Settings > Connections > Bluetooth


Figure 1: Disabling Bluetooth

Step 2: Enable debugging mode

Make sure debugging mode is enabled. You can enable debugging mode from the Settings menu, as shown below.

Path: Settings > About Watch > Debugging (toggle on)


Figure 2: Enabling debugging mode

Step 3: Set the Wi-Fi to Always on

This step is optional, but to avoid any unnecessary issues, it is better to set the Wi-Fi to Always on. Leaving the setting on Auto can sometimes create issues.

Path: Settings > Connections > Wi-Fi > Always on


Figure 3: Setting the Wi-Fi to Always on

Caution: Setting the Wi-Fi to Always on can drain the battery drastically. After debugging, it should be set back to Auto again for better battery life.

Step 4: Connect to the network

Connect the watch to the same network as your PC.

Choose either of the following ways to connect the devices to the same network:
• By creating a mobile hotspot
• By using Wi-Fi under the same router

Step 5: Restart the watch

After the previous steps have been completed, restart the device. If you do not, the connection setup shows an error.


Figure 4: Rebooting the watch

Step 6: Establish the connection from the Remote Device Manager

  1. In Tizen Studio, open the Device Manager and click Launch Remote Device Manager.


    Figure 5: Launching the Remote Device Manager

  2. Scan for new devices. The window shows a list of available devices and their IP addresses. You can also add a device manually from the Remote Device Manager window.


    Figure 6: Searching for available devices for connection

  3. To connect to the device, click the Connect toggle next to the watch's IP address and port information. During the connection setup, a pop-up on the watch requests RSA authentication; you must accept it to complete the process.


Figure 7: Connecting to the watch from the Remote Device Manager

You are now all set to deploy your app from Tizen Studio to the wearable device.
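If you prefer the command line, the same connection can be made with the SDB tool that ships with Tizen Studio. The commands below are a minimal sketch; the IP address is a placeholder for your watch's address, and 26101 is the default SDB port for Tizen devices, which may differ on your setup:

```shell
# Connect to the watch over Wi-Fi (26101 is the default SDB port).
sdb connect 192.168.0.101:26101

# Verify that the watch now appears in the device list.
sdb devices

# Open a shell on the watch to run diagnostic commands.
sdb shell
```

The sdb binary lives in the tools directory of your Tizen Studio installation; add it to your PATH to run it from anywhere.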

Step 7: Permit to install user applications

As a security feature, the device or emulator you have connected to does not contain the necessary certificates for installing user applications, and you must install them before being able to run your application on it. To do so, select “Permit to install applications” from the context menu of the device in the Device Manager.

If the message “The permit to install application is not required for this device” appears, this step is unnecessary.


Figure 8: Setting the permit to install applications in the Device Manager

Step 8: Launch your project

Now, deploy your project on your connected watch, as shown in the image below.

Path: Right-click on the project > Run As > 1 Tizen Native Application


Figure 9: Deploying the project from Tizen Studio to a connected watch
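As an alternative to the Run As menu, a built package can also be deployed with SDB once the device is connected. This is a sketch assuming a native project that builds to a .tpk package; the package file name and app ID below are placeholders for your own project's values:

```shell
# Install the built package on the connected watch.
sdb install org.example.myapp-1.0.0-arm.tpk

# Launch the installed application by its app ID.
sdb shell app_launcher -s org.example.myapp
```

The .tpk file is produced in the project's build output directory after a successful build in Tizen Studio.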

Some helpful tips for connecting your device with the Remote Device Manager

  • Check the IP address of your watch from Connections > Wi-Fi > Wi-Fi Networks > tap on the SSID (your Wi-Fi name) > IP address.

  • If your device is already shown in the Remote Device Manager's history, delete it and try to connect again.

  • Launch the Device Manager to see the Log View.


    Figure 10: The Log View from Device Manager

  • Make sure the watch is not connected to any other device, including a phone. Otherwise, the connection fails and you receive the following error message:


    Figure 11: Error message during multiple connections

  • If you cannot find the watch after scanning for devices from the Remote Device Manager, make sure your device is on the same network. To check this, go to the command prompt on your PC and ping the IP address of the watch in the following manner:
    ping <Watch_IP>

    If the ping command fails to connect to the IP address of your watch, it is not on the same network, and the SDB / Remote Device Manager does not work. To fix this, you need to change the network settings of your router or PC. The issue can also be caused by firewall settings, although this is rare.
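A quick way to run this reachability check and, if it succeeds, retry the SDB connection in one go (a minimal sketch for Linux/macOS; the IP address is a placeholder for your watch's address):

```shell
# Ping the watch once; only attempt the SDB connection if it responds.
WATCH_IP=192.168.0.101
if ping -c 1 "$WATCH_IP" > /dev/null 2>&1; then
    sdb connect "$WATCH_IP":26101
else
    echo "Watch is not reachable - check that it is on the same network."
fi
```

On Windows, replace `ping -c 1` with `ping -n 1`, which sends a single echo request.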

Conclusion

The main purpose of this article is to help new developers deploy Tizen projects to a real device using the Tizen Remote Device Manager. Hopefully, this tutorial gives beginners a good starting experience with Tizen Studio.

If you have any other problems or queries regarding launching projects with the Remote Device Manager, feel free to reach out through the Samsung Developers Forum.

View the full blog at its source
