
Save the Date for the 2020 Best of Galaxy Store Awards


Recommended Posts


The 2020 Best of Galaxy Store Awards are just around the corner. For the first two years, these awards were announced during the annual Samsung Developer Conference. However, the awards show has moved online for 2020, which means more people can tune in to see this year's winners.

The Best of Galaxy Store Awards recognize the top games, apps, themes, watch faces, and, new this year, Bixby capsules.

Last month, in the season one finale of the Samsung Developers podcast, we talked about the history of the awards, highlights from previous ceremonies, exciting new changes to Galaxy Store, and our upcoming 2020 Best of Galaxy Store Awards show. Be sure to tune in and listen.

Curious what makes our past winners stand out? We have interviewed a number of past winners on our blog. Read what inspires them and what their tips are for finding success on Galaxy Store.

Get a sneak peek of this year's awards and save the date for Wednesday, December 9 at 5pm PST. The awards show will premiere on our YouTube channel, so be sure to subscribe and hit the notification bell.

View the full blog at its source

  • Similar Topics

    • By STF News
      Start Date Dec 09, 2020
      Location YouTube
      The excitement has begun! Planning is underway for this year’s Best of Galaxy Store Awards, a ceremony that honors developers and designers who publish content in Galaxy Store. This is Samsung Developers' third year of recognizing excellence in design, innovation, and quality.
      We’re adding exciting new features to our awards ceremony and have expanded our awards to include Bixby capsules. Join us as we reveal and celebrate this year's winners!
      When: December 9th, 5pm PST
      Where: Hosted virtually on YouTube
      View the full blog at its source
    • By STF News
      Galaxy AR Emoji SDK for Unity makes it possible to control the user’s AR Emoji avatar in a Unity application developed for Galaxy devices. Using the SDK, you can add custom animation movements to the user’s AR Emoji avatar in your Unity applications (for example, a game for Galaxy devices in which a user can select their own AR Emoji avatar as the player instead of the default game character). In this tutorial, you will learn how to add custom animations to the AR Emoji avatar in a game or any other application using the Galaxy AR Emoji SDK for Unity.
      Prerequisites
      First, if you are not already a partner, request Samsung partnership approval through the Partnership Request Process to use the Galaxy AR Emoji SDK for Unity. The SDK team reviews partnership requests as soon as they are received. After your partnership is approved, go through the basic tutorial by signing in to your account on the Galaxy AR Emoji Developer site.
      In order to animate the AR Emoji model, you need to load the model into your application (described in the “Importing the AR Emoji Model” section of the basic tutorial mentioned above). As our project name is “Animation Tutorial,” we recommend using the same name for your project. You can also add a feature to your app using an Android intent so that a user can create a new avatar by opening the AR Emoji camera directly from your app, for a better user experience. Now, you are ready to proceed with this tutorial and animate the loaded model on Galaxy devices.
      Animate the AR emoji
      To animate the avatar, you need the following two components:
      - Animation clip: the animation data containing information about the animation movement.
      - Animator controller: allows you to arrange a set of animation clips and associated animation transitions for a character or object.
      Get familiar with how these two components work together to incorporate animation into the AR Emoji model. You should understand these components in order to follow the successive sections and complete the project.
      Create animation clips
      Our project facilitates four different types of animation for the loaded model, so we need four animation clips. The AR Emoji supports only clips in the ANIM format (see how to create animation clips for the AR Emoji using the “AnimationGenTool.unitypackage” package of the SDK).
      For this project, we use the animation clips included in the SDK (see Assets → Resources → Animations → Mecanim). However, you can use custom clips created with tools such as 3ds Max or Autodesk Maya, or clips downloaded from other sources, such as Mixamo.

      Figure 1: The animation clips included in the SDK.

      Also make sure that every animation clip has the Loop Time option selected so that the AR Emoji doesn’t stop animating after a single animation cycle. To do this, select the animation clip and check Loop Time in the Inspector. Do the same for all the animation clips.

      Figure 2: Enabling the Loop Time for Hip Hop Dancing animation clip.

      Our animation clips are now ready. In the next section, we will configure our animator controller for utilizing these clips.
      Configure the animator controller
      Although you can create your own animator controller, the default “AnimController” is available under Assets → Resources.

      Figure 3: The default animator controller for the loaded AR Emoji.

      You need to connect the controller to the loaded model. Follow the steps below:
      Open the “GettingAREmojiSample” scene, select the “GLTFComponent” in the hierarchy, and then make sure to check the Use Legacy Animator in the Inspector.
      Add the animation clips and set appropriate transition conditions between them. Double-click “AnimController” to open the Animator tab, which shows the state machine of the controller.

      Figure 4: The state machine of the animator controller.

      Drag all the animation clips to the state machine and place them in a convenient position. Note that “Idle” is the default animation state for our model.

      Figure 5: Drag the animation clips into the state machine.

      It's time to make the transitions between the states and set appropriate conditions. To do so, you need to create one Boolean parameter for each state. Click the + button and then Bool in the Parameters tab as shown in the diagram below, and name them according to their states (for example, walk, jump, yelling, and dance).

      Figure 6: Create a Boolean parameter for each state (animation clip).

      Right-click on the “Idle” state, select Make Transition, and click on any other state (for example, “Hip Hop Dancing”). As a result, a transition arrow from “Idle” to the selected state is generated. Select the arrow as shown below and then perform the following tasks in the Inspector:

      Figure 7: Select the transition arrow.

      Uncheck the Has Exit Time option for the transition.

      Figure 8: Disable the Has Exit Time option.

      Add a condition for the transition by clicking on the + button.

      Figure 9: Add a condition for the transition.

      Select the appropriate parameter (for example, “dance” for the “Hip Hop Dancing” state).

      Figure 10: Select the parameter for the condition.

      Set the value according to the transition condition (in this example, true, which indicates the model will go from the “Idle” to “Hip Hop Dancing” state).

      Figure 11: Select the condition value for the transition.

      We have completed the transition condition for the “Idle” to “Hip Hop Dancing” state. For the reverse transition, “Hip Hop Dancing” to “Idle,” everything is the same as above, except the parameter value. The value should be false as the avatar will go back from the “Hip Hop Dancing” state to the “Idle” state. This completes the full transition condition for the “Hip Hop Dancing” state.
      Complete the transition conditions in the same way for the rest of the states (Jump, Walk, and Yelling). The final state machine should look like the following image:

      Figure 12: The complete state machine for our project.

      Modify the application UI
      So far, we have created our animation clips and the animator controller. Now, we need to modify the application UI for the users to interact with the application. As a result, they can animate their avatars by simply tapping a button. We need to add four buttons to the canvas of the “GettingAREmojiSample” scene, which enables the different animations for our model. Follow the steps below:
      Open the “GettingAREmojiSample” scene, add a button to the panel of the canvas (see the screenshot below), rename it as you wish (for example, “jump_button”), and set the width to 200 and the height to 60 in the Inspector.

      Figure 13: Create a button in UI and configure the property in the Inspector.

      Duplicate the button three times to make a total of four buttons, rename and position them conveniently (as shown below), and change the button texts to indicate the appropriate animation actions.

      Figure 14: Add four buttons in the UI and name them according to their actions.

      The UI is now ready along with the animator controller.
      Modify the script
      The final task is to modify the script so that tapping on a button changes the value of the corresponding Boolean parameter within the controller. At this stage, if we can somehow change the value of the Boolean parameter at run-time, the model will be animated accordingly. To change the value by interacting with the buttons, we need to edit the “ButtonBehaviour.cs” script which is attached to the canvas of the “GettingAREmojiSample” scene. Perform the following steps:
      Open the “ButtonBehaviour.cs” script from the Assets → Scripts in any editor (such as Visual Studio).
      Declare the variables for the four buttons at the beginning of the ButtonBehaviour class, as shown below:
      public Button jumpBtn;
      public Button yellBtn;
      public Button walkBtn;
      public Button danceBtn;
      Add references to these variables from the UI buttons. Go back to Unity and select the canvas in the Hierarchy. In the Inspector, the references are empty for the button variables. Drag each of the buttons from the panel in the Hierarchy to their corresponding empty fields in the Inspector.

      Figure 15: Add button references to button variables.

      Add listeners to the button variables at the end of the Start() method, as shown below:
      walkBtn.onClick.AddListener(OnClickWalk);
      yellBtn.onClick.AddListener(OnClickYell);
      jumpBtn.onClick.AddListener(OnClickJump);
      danceBtn.onClick.AddListener(OnClickDance);
      Declare four methods, one for each listener, immediately below the Start() method:
      private void OnClickDance(){ }
      private void OnClickJump(){ }
      private void OnClickYell(){ }
      private void OnClickWalk(){ }
      When a user taps a button in our application, the corresponding method is invoked at run-time. Therefore, we have to write some code inside each function to change the value of the corresponding Boolean parameter.
      To access the Boolean parameters within the animator controller, we need to declare an animator variable in the script. Write a line of code to declare the variable at the beginning of the class and add the Update() method inside the class.
      public Animator anim;

      void Update(){
          if (anim == null){
              GameObject o = GameObject.Find("rig_GRP");
              if (o != null){
                  anim = o.GetComponent<Animator>();
              }
          }
      }
      Finally, when a specific button is tapped by the user, the corresponding Boolean value should be set to true and the others to false in the invoked function. See below for an example of one method and complete the others in the same way.
      private void OnClickDance(){
          anim.SetBool("walk", false);
          anim.SetBool("yelling", false);
          anim.SetBool("jump", false);
          anim.SetBool("dance", true);
      }
      Now, save the project and click Build and Run. This installs and launches the application on your Galaxy device, which should be connected to your PC through a USB cable. After loading the AR Emoji, you can animate it by tapping any of the animation buttons. See the screenshots below from our demo project.
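The three remaining handlers follow the same pattern. Here is one possible completion (a sketch; the parameter names walk, yelling, jump, and dance must match the Boolean parameters you created in the Animator earlier, so adjust them if you named yours differently):

```csharp
// Each handler enables exactly one animation state and disables the rest,
// so the state machine transitions driven by these Booleans never conflict.
private void OnClickJump(){
    anim.SetBool("walk", false);
    anim.SetBool("yelling", false);
    anim.SetBool("dance", false);
    anim.SetBool("jump", true);
}

private void OnClickYell(){
    anim.SetBool("walk", false);
    anim.SetBool("jump", false);
    anim.SetBool("dance", false);
    anim.SetBool("yelling", true);
}

private void OnClickWalk(){
    anim.SetBool("yelling", false);
    anim.SetBool("jump", false);
    anim.SetBool("dance", false);
    anim.SetBool("walk", true);
}
```

This fragment only runs inside the Unity project's ButtonBehaviour class; it is not a standalone program.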

      Figure 16: The avatar is being animated.

      Conclusion
      Hopefully, you have enjoyed this tutorial and learned about AR Emoji using the AR Emoji SDK for Unity. You can use this knowledge to develop more entertaining applications that incorporate the AR Emoji avatars with other custom animation clips. You can also develop Android games using your own AR Emoji with this animation technique.
      View the full blog at its source
    • By STF News
      Maintaining the legacy of foldable technology, Samsung recently released the new Galaxy Z Fold2. This device is designed to provide a new and seamless experience to users with its Infinity Flex Display. As a developer, you can adjust your app to provide the best UI experience to your users.
      In this blog, we will demonstrate how a stopwatch app can be modified to adjust to Galaxy Z Fold2 devices. The stopwatch app is pretty simple, with three functions: start, pause, and reset.
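The display logic behind such a counter is plain arithmetic on an elapsed-seconds field. As a minimal, framework-free sketch (the StopwatchFormat class and formatTime method are illustrative names, not part of the sample app):

```java
public class StopwatchFormat {
    // Convert an elapsed-seconds counter into the h:mm:ss text shown on screen.
    static String formatTime(int seconds) {
        int hours = seconds / 3600;
        int minutes = (seconds % 3600) / 60;
        int secs = seconds % 60;
        return String.format("%d:%02d:%02d", hours, minutes, secs);
    }

    public static void main(String[] args) {
        System.out.println(formatTime(3661)); // prints "1:01:01"
    }
}
```

The activity's timer callback only has to increment the seconds field while running is true and pass it through a formatter like this one.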


      Figure 1: Stopwatch app example
      In order to provide a seamless experience to the user, we have to ensure app continuity, adjust the activity according to the UI, support multi-resume in multi-window, and check the display cutout. So let’s get started.
      App continuity
      Like the previous Galaxy Z Fold, the new Galaxy Z Fold2 has two separate physical screens. The cover display is 2260 x 816 pixels and the main display is 2208 x 1768 pixels. To provide a seamless experience to the user while folding and unfolding the device, the app must maintain its continuity by preventing data loss. You can ensure continuity by using the onSaveInstanceState() method. First, save the data of the current state with onSaveInstanceState(). For the stopwatch app, the time that has passed is saved in seconds before the activity is paused.
      @Override
      public void onSaveInstanceState(Bundle savedInstanceState) {
          savedInstanceState.putInt("seconds", seconds);
          savedInstanceState.putBoolean("running", running);
          savedInstanceState.putBoolean("wasRunning", wasRunning);
          super.onSaveInstanceState(savedInstanceState);
      }
      Then restore the data of the activity in the onCreate() method.
      @Override
      protected void onCreate(Bundle savedInstanceState) {
          super.onCreate(savedInstanceState);
          setContentView(R.layout.activity_main);
          if (savedInstanceState != null) {
              seconds = savedInstanceState.getInt("seconds");
              running = savedInstanceState.getBoolean("running");
              wasRunning = savedInstanceState.getBoolean("wasRunning");
          }
      }

      Figure 2: Continuity of the stopwatch while folding and unfolding the device
      Ensure dynamic resizing of your app
      If you want your app to support multi-window mode, define the activity as resizable by setting the resizeableActivity attribute to true in AndroidManifest.xml. This ensures maximum compatibility with both the cover screen and the main screen of the device.
      <activity
          android:name=".MainActivity"
          android:resizeableActivity="true">
          …
      </activity>
      Another approach is to define an aspect ratio for your app. The Galaxy Z Fold2’s cover screen has a ratio of 25 : 9, whereas the main screen has a ratio of 22.5 : 18. To be compatible with the device, you should test that your app supports these ratios and fills the entire display.
      You can use the minAspectRatio or maxAspectRatio attribute to constrain your app within the feasible aspect ratios.
      Please note that if the resizeableActivity attribute is set to true, the minAspectRatio and maxAspectRatio attributes are ignored.
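As a sketch of the aspect-ratio approach (the values here are the long-side to short-side ratios of the two screens quoted above, 25:9 ≈ 2.78 and 22.5:18 = 1.25; as noted, these attributes only take effect when the activity is not resizable):

```xml
<activity
    android:name=".MainActivity"
    android:resizeableActivity="false"
    android:minAspectRatio="1.25"
    android:maxAspectRatio="2.78">
</activity>
```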
      Figure 3: Dynamically resizable app in Pop-Up view and Split-Window view
      Multi-Resume for multi-window
      Up to Android 9, only one of the activities visible in multi-window mode is allowed to stay in the RESUMED state; all other activities are put into the PAUSED state. Therefore, you have to opt your app in to multi-resume behavior by adding the following to the AndroidManifest.xml file.
      <meta-data
          android:name="android.allow_multiple_resumed_activities"
          android:value="true" />
      However, starting from Android 10, all activities visible in multi-window mode are allowed to stay in the RESUMED state, so you no longer need to force multi-resume behavior. There are still some cases in Android 10 where an activity stays in the PAUSED state:
      • In a minimized split-screen (with launcher on the side), the top activity isn't resumed because it's not focusable
      • In picture-in-picture mode, the activity isn't resumed because it's not focusable
      • When activities are covered by other transparent activities in the same stack
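If your activity needs to react when it gains or loses the top position in these cases, Android 10 also added a dedicated callback you can override in your activity (a sketch; what you do inside it depends on your app):

```java
@Override
public void onTopResumedActivityChanged(boolean isTopResumedActivity) {
    // Android 10+: called when this activity gains or loses the
    // top-resumed position among the visible activities.
    if (isTopResumedActivity) {
        // e.g., reacquire exclusive resources such as the camera
    } else {
        // e.g., release them for the activity that is now on top
    }
}
```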

      Figure 4: Multi-Resume in multi-window
      Display cutout
      The main display of the Galaxy Z Fold2 has a punch hole in the upper right side. You can set a display cutout mode according to your content style. By default, the content is rendered into the cutout area in portrait mode and letterboxed in landscape mode. If you want your content to be rendered into the cutout area in both portrait and landscape modes, define the mode as shortEdges. In the sample app, the window is set to full screen and the cutout mode is set to shortEdges in the style.xml file.
      <item name="android:windowLayoutInDisplayCutoutMode">shortEdges</item>
      <item name="android:windowTranslucentNavigation">true</item>
      Figure 5: Display cutout Default mode



      Figure 6: Display cutout in shortEdges mode
      Hopefully this blog helps you update your app for the Galaxy Z Fold2 and give users a better UI experience. To check out the sample app, click here. You can also view Samsung’s official documentation for Galaxy Z devices. If you have any questions regarding foldable UI, feel free to post them in the Samsung Developers Forums.
      View the full blog at its source
    • By STF News
      Two of the things I really like about game developers are their constant drive for innovation and seemingly endless energy. Nowhere is that more obvious than at an event like the IndieCade International Festival of Independent Games. For 15-plus years, IndieCade has been the premier stand-alone gathering and showcase for indie games worldwide.
      This year, the festival had to completely rethink itself to accommodate our new global reality, and I am so happy to say that the organizers absolutely nailed their “Anywhere and Everywhere” motto. For nine days, the Samsung Developers team were immersed in game presentations and awards, networking events, game slams, and social media madness.
      Let’s take a look at some of the highlights of the event and my general musings about the games and participants.
      As an event sponsor, we had the privilege to collaborate with the IndieCade audience in several ways:
      - Attending multiple live presentations
      - Presenting the renewed Samsung Galaxy Store
      - Creating a virtual Samsung Developers booth
      - Watching the nominated games and the highly-anticipated award winners
      The Samsung Galaxy Store Presentation: Where Indie Games Get Discovered
      For obvious reasons, the highlight of the event for me was leading the presentation that introduced the new game-focused experience of Galaxy Store. Available in more than 180 countries with more than 400M monthly active users, Galaxy Store is a great place for developers to publish their games and get discovered. With the help of my colleague, Shelley Wu, we showcased the different opportunities that game developers have to promote and market their games, gave a hands-on walkthrough of the Games tab in the store, and showed how its design promotes discovery of new games.
      Additionally, I gave a live step-by-step demonstration on how to create a Seller Portal account, how to request commercial seller status, and the different steps an indie developer should follow to publish their first game.
      This presentation is filled with helpful information and tips, and I recommend you watch it.
      Live Presentations
      Pre-selected game developers were able to showcase their games and participate in the conference chat on IndieCade’s Twitch channel. It is hard to pick just a few games, but these will give you a pretty good idea of what you could expect:
      - Aria
      Aria is a story-based VR experience where you are placed in a pod to escape a catastrophic explosion on your spaceship. Watch Aria's stream
      - Endlight
      This one is more like a psychedelic, fast-paced experience. Watch Endlight's stream
      - Lost Words
      This one is a highly praised puzzle game that is set in the pages of a young girl’s diary, and you have to use words to advance the story. Watch Lost Words' stream
      - Thousand Year Old Vampire
      This one is for fans of tabletop games. It is incredibly innovative and, unlike most TTRPGs, is a solo experience that allows you to create the story of the titular vampire. Watch Thousand Year Old Vampire's stream
      Nominees and Winners
      This year, IndieCade nominated over 50 games and in the end, there were 12 award winners. As a game designer myself, I particularly enjoyed reading the comments from the jury.
      If you'd like to dig more into IndieCade 2020, check out the awards video or the event round-up
      We love to chat with mobile game developers and help you as you develop, publish, and market your own games. Galaxy Store is a great place to publish your game and get discovered. If you are a game developer and want to request a quick chat with us, just fill out this form.
      View the full blog at its source
    • By STF News
      Start Date Nov 17, 2020
      Location Online
      Samsung is proud to sponsor and speak at Wacom's Connected Ink 2020 conference. Samsung's Catarina Cho will speak on a panel discussing "New Work and the Future of Digital Education". You can also take part in the digital drawing experience on Android. After the conference, be sure to check out the Galaxy S Pen Remote SDK.
      Connected Ink is an open innovation platform that takes you on a journey of creativity, art, and music combined with the latest developments in digital transformation and ink technology. It is a hybrid online and offline event that lets you connect and expand your network, and explore and discover unexpected creative sparks. Join Wacom in making the world a more creative place with future technologies.
      Connected Ink is a 24-hour online event, freely accessible from around the globe, or a live experience in Tokyo.
      View the full blog at its source

