‘Executive review board’ has final say on controversial App Store titles



 

Each week, an “executive review board” led by Apple marketing chief Phil Schiller meets to decide the fate of controversial App Store submissions, a report revealed on Friday.

The now-defunct Infowars app.


The ERB establishes policy for Apple’s Worldwide Developer Relations division, often dubbed App Review for short, CNBC said. In the case of apps on the edge of rejection, the ERB is the end of the line for decision making. Normally appeals must pass through the regular App Review Board before getting that far.

It was reportedly Schiller and the ERB that banned Alex Jones’ Infowars from the App Store. While Infowars is infamous for claims like calling the Sandy Hook school massacre a hoax, Apple cited threats against a reporter as its reasoning. The company had taken some flak for pulling Infowars podcasts while leaving the app intact.

To support a growing workforce, App Review recently opened new offices in Cork, Ireland, and Shanghai, China, an anonymous source said. The division is believed to have over 300 reviewers in all, and while it’s headquartered in Sunnyvale, Calif., many teams specialize in, and are fluent in, non-English languages.

Schiller “rarely if ever” visits the offices where app reviews take place, CNBC continued. Day-to-day affairs are said to belong to VP Ron Okamoto, as well as an unnamed director who joined Apple after its TestFlight takeover in 2015.

The review process begins with reviewers “claiming” a group of apps through a Web portal called App Claim. Those apps are often tested on an iPad, even if it’s an iPhone app, though there are dedicated stations for testing Apple Watch and Apple TV apps as needed.

Beyond screening for bugs or illegal content, the reviewers check against the latest App Store guidelines and decide whether to accept, reject, or hold a submission. The whole procedure can take just a few minutes, since most apps are simple, multiple sources indicated.

Reviewers are allegedly under the gun to meet quotas between 50 and 100 apps per day, something tracked by an app called Watchtower. They can also be called to task for other criteria, such as whether decisions are later overruled, and whether they meet SLA (service-level agreement) goals of reviewing 50% of apps within 24 to 48 hours.

On July 30, 2018, the SLA rate fell to 6%, at which point App Review management announced it was “opening up” 12-hour days.

“Please note that you should not work over 12 hours in one day,” an internal email cautioned.





Exploring Android Q: Adding bubble notifications to your app


In 2018, Google added a new “chat head” feature to its Phone application for Android, which displayed the caller’s avatar as a floating bubble-style notification. When tapped, this bubble expanded to reveal a strip of controls that allowed the user to perform tasks directly from the notification, including putting the caller on speakerphone and hanging up.

In Android Q, Google is making “chat head” notifications an official part of the Android platform, with the introduction of the Bubble API. These bubbles can contain useful information about events that are happening elsewhere in your app, but they can also contain custom actions. These actions allow the user to interact with your app, even when they’re viewing another Activity, application, or they’re located in an unrelated part of the Android operating system.

In this article, I’ll share everything you need to know about this upcoming Android Q feature, including what bubbles have to offer the developer and the end user, best practices, and some limitations you need to be aware of before you start using bubbles in your own Android apps.

By the end of this article, you’ll be up to speed with this new Android Q feature, and will have created an Android app that features its own bubble notifications.

What are Android Q’s bubbles?

Bubbles display your app’s content in a window that appears to “float” above the existing foreground Activity.

In its collapsed state, a bubble notification is represented by a small icon. These icons are plain white by default, but you can customize them with an image; for example, you might use your app’s icon, or the avatar of the person associated with this notification.

Bubble notifications appear as collapsed icons in their default state

When the user taps a collapsed bubble, an intent will be invoked and your bubble will be displayed in its expanded state, which typically contains additional information and may also provide access to some related functionality.

Clicking a bubble icon will reveal its expanded layout

When a bubble is expanded, the associated application becomes the foreground process, if it isn’t already.

Users can interact with a bubble without having to navigate away from their current Activity, which makes bubbles a powerful way to re-engage users, and potentially draw them back to your app.

Even if the user is already inside your app, a bubble can help them quickly and easily respond to important events that are happening elsewhere in your application. For example, imagine you’ve developed a messaging app, and the user receives a message from Contact B, when they’re midway through drafting a message to Contact A. Rather than forcing them to navigate to the Activity where this event occurred, you can present Contact B’s message as a bubble notification, and the user can then read and respond to that message without having to navigate away from their draft.

Unless the user explicitly dismisses a bubble by dragging it offscreen, that bubble will remain visible even as the user navigates between different applications and areas of the operating system. Since bubbles are a persistent part of the Android user interface (UI), they can provide a convenient place to store notes or manage ongoing tasks; for example, you might store the user’s To Do list or travel itinerary inside a bubble, so it’s always within easy reach.

You could even use bubbles as reminders; for example, your app might generate a bubble when it’s time for the user to log into a meeting, send an important email, or perform some other time-sensitive task.

Haven’t Facebook been using bubble notifications for years?

Floating bubble-style notifications aren’t a new concept for Android, as they’ve long been available in third party apps, most notably in Facebook Messenger. However, previously it was the developer’s responsibility to design and implement their own bubble notifications.

Creating a custom feature is always more time-consuming than leveraging classes and APIs that are already built into the Android platform, so now that bubbles are officially part of Android it should be much easier for developers to use this notification style. This official support will also provide a more consistent experience for users, as all bubbles should now have exactly the same behaviour, regardless of the application that generated them.

Android Q bubbles: What are the restrictions?

Bubbles are displayed on top of whatever content the user is currently viewing. If your app generates a large number of bubbles, or it creates unnecessary bubble notifications, then users are quickly going to lose patience with your app.

Someone who feels bombarded by bubbles may choose to disable the bubble feature for your application, or they may even uninstall your app entirely.

To safeguard the user experience, your bubble notifications will only be displayed if they meet at least one of the following criteria:

  • Your application is in the foreground when the notification is sent.
  • The notification has a Person added. If there are multiple people associated with a notification, then you must also mark this conversation as a group, using setGroupConversation(boolean).
  • The notification is from a call to Service.startForeground, has a Person added, and falls into the CATEGORY_CALL notification category, which indicates this is a synchronous communication request, such as a voice or video call.

If none of these conditions are met, then your bubbles will be displayed as a standard notification instead. If the device is locked or its always-on display is active, then your bubbles will again only appear as standard notifications.

You should also be aware that at the time of writing, bubbles were an optional feature. When your application first tries to generate a bubble, the user will be presented with a permissions dialog and they’ll have the option to disable bubbles for your application. If the user disables the bubble feature, then your app’s bubbles will always be displayed as standard notifications, even if they fulfil all of the above criteria.
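
To illustrate the second criterion above, the following sketch attaches a Person to a notification so that it remains eligible for bubbling while the app is in the background. This is a sketch only: the contact name is a placeholder, and the `context`, `channel` and `bubbleData` objects are assumed to be created elsewhere, along the lines of the tutorial code later in this article.

```java
//Sketch only: adding a Person keeps a background notification eligible for bubbling//
//“Jane Doe” is a placeholder; context, channel and bubbleData are assumed to exist//

Person sender = new Person.Builder()
        .setName("Jane Doe")
        .setImportant(true)
        .build();

Notification.Builder builder = new Notification.Builder(context, channel.getId())
        .setSmallIcon(R.drawable.ic_message)

//Associate the notification with a person//

        .addPerson(sender)
        .setBubbleMetadata(bubbleData);
```

If the notification represents a conversation between several people, remember to also call setGroupConversation(true), as noted above.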

What we’ll be creating

In this article, we’ll build an application that uses Android Q’s new bubble notifications feature. To make our app easier to test, it’ll feature a button that generates a bubble notification every time it’s tapped.

We'll be creating an app that triggers a bubble notification on-demand

Since chat applications are the most obvious choice for bubbles, our app will simulate the user receiving a new message, similar to the Facebook Messenger app. When expanded, this bubble will include a space where the message would be displayed, plus two actions that the user can perform: call this contact, or send them a text response.

Android Q's bubble notifications can contain information and actions

To experiment with this new feature, you’ll need the latest preview of Android Studio 3.5. You’ll find the latest version over at the Preview Release website.

You’ll also need the Android Q preview SDK and Android SDK Build-Tools 28, or higher:

  • Select “Tools > SDK Manager” from the Android Studio toolbar.
  • In the subsequent window, select the “SDK Platforms” tab.
  • Select the latest release of “Android Q Preview.”
  • Switch to the “SDK Tools” tab.
  • Select “Android SDK Build-Tools 28,” or higher.
  • Click “OK” to install these components.

Note that the following tutorial was created using Android Q Beta 2, when bubble notifications were still considered an experimental feature. If you’re using a later version of Android Q, then you may encounter some minor differences.

Building our Android Q app

To get started, create a new Android project using the “Empty Activity” template, and when prompted make sure your app is targeting the latest version of Android Q.

If you’re adding bubbles to an existing application, then you’ll need to open your project’s build.gradle file and upgrade compileSdkVersion, minSdkVersion and targetSdkVersion to “android-Q.”

android {
   compileSdkVersion 'android-Q'
   defaultConfig {
...
       minSdkVersion 'Q'
       targetSdkVersion 'Q'
...
   }
...
}

Next, open your build.gradle file and add the latest version of the Material Components for Android library to your “dependencies” block:

dependencies {
   implementation fileTree(dir: 'libs', include: ['*.jar'])
   implementation 'androidx.appcompat:appcompat:1.0.2'
   implementation 'androidx.constraintlayout:constraintlayout:1.1.3'

//Add the following//

   implementation 'com.google.android.material:material:1.1.0-alpha07'
   testImplementation 'junit:junit:4.12'
   androidTestImplementation 'androidx.test.ext:junit:1.1.0'
   androidTestImplementation 'androidx.test.espresso:espresso-core:3.1.1'
}

Creating the main user interface

Our project will eventually need two layouts: one for the main application, and one that defines the layout of our expanded bubble.

Open your project’s activity_main.xml file, and let’s create the button that’ll generate our bubble notification:

<?xml version="1.0" encoding="utf-8"?>
<LinearLayout xmlns:android="http://schemas.android.com/apk/res/android"
   android:layout_width="match_parent"
   android:orientation="vertical"
   android:gravity="center"
   android:layout_height="match_parent">

   <Button
       android:id="@+id/createBubble"
       android:layout_width="wrap_content"
       android:layout_height="wrap_content"
       android:text="Create a bubble notification" />

</LinearLayout>

Building a bubble notification

Next, we need to create the bubble notification. Android Q’s bubbles are built on top of Android’s existing notification system, so if you have any previous experience of working with Android notifications, then creating a bubble should feel instantly familiar.

You create an Android Q bubble by completing the following steps:

1. Create at least one notification channel

Android 8.0 introduced the concept of notification channels, where all notifications that are posted to the same channel have the same behaviour.

Since our application is targeting Android 8.0 or higher, all of our notifications must be assigned to a notification channel, including bubbles.

To create a notification channel, you need to construct a NotificationChannel object and pass it:

  • An ID, which must be unique to your package.
  • The channel’s name, which will be displayed to the user via the channel’s settings screen.
  • An importance level. In Android Oreo and higher you can no longer set the priority level for individual notifications. Instead, you must specify the channel’s importance level, which is then applied to every notification that’s posted to that channel. Bubble notifications must be assigned a level of IMPORTANCE_HIGH, as this ensures the bubble will appear onscreen, regardless of what the user is currently doing.

Android Q also introduces a setAllowBubbles() method, which allows you to specify that a channel supports bubbles. The setAllowBubbles() value is ignored for channels that have an importance level of IMPORTANCE_DEFAULT or lower, so you must mark your channel as setAllowBubbles(true) and IMPORTANCE_HIGH.

In the following snippet, we’re creating our notification channel. This is also your chance to specify any additional desired behavior, such as whether notifications posted to this channel should cause the device’s LEDs to flash.

        CharSequence name = "My new channel";
        String description = "Description";
        int importance = NotificationManager.IMPORTANCE_HIGH;

//Create the channel object//

        channel = new NotificationChannel("1", name, importance);
        channel.setDescription(description);
        channel.setAllowBubbles(true);

You can then submit this NotificationChannel object to the NotificationManager, using the createNotificationChannel() method:

               notificationManager.createNotificationChannel(channel);

2. Create the bubble intent

Later in this tutorial, we’ll create a BubbleActivity that’ll launch every time the user interacts with the bubble icon.

In the following snippet, we’re creating a PendingIntent, which specifies the Activity that’ll be displayed inside our expanded bubble:

               Intent target = new Intent(MainActivity.this, BubbleActivity.class);
               PendingIntent bubbleIntent =
                       PendingIntent.getActivity(MainActivity.this, 0, target, PendingIntent.FLAG_UPDATE_CURRENT /* flags */);

3. Create the BubbleMetadata

Next, you need to create a BubbleMetadata object, which will encapsulate all the data required to display our notification bubble.

You create a BubbleMetadata object by calling the Notification.BubbleMetadata.Builder constructor. We can then use setIntent() to specify the target bubble intent, which will run every time the user interacts with this bubble.

               Notification.BubbleMetadata bubbleData =
                       new Notification.BubbleMetadata.Builder()
...
...
...
                            .setIntent(bubbleIntent)
                            .build();

When building a BubbleMetadata object, we also need to set the icon that’ll represent this bubble in its initial, collapsed state, using the Notification.BubbleMetadata.Builder.setIcon(Icon) method. You must provide an icon for every bubble that your application creates, and this icon should be representative of the bubble’s content.

The shape of the bubble icon is adaptive, and can be modified to match the device’s theme. Note that if your icon is bitmap-based, then you’ll need to use createWithAdaptiveBitmap, which will ensure your icon is generated according to the design guidelines defined in the AdaptiveIconDrawable class, or <adaptive-icon> tags.
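
For instance, if you wanted to represent the bubble with a contact’s photo rather than a drawable resource, you could build the icon from a Bitmap. The following is a sketch only; “R.drawable.contact_avatar” is a hypothetical resource standing in for whatever image your own app loads.

```java
//Sketch only: building the bubble icon from a bitmap//
//R.drawable.contact_avatar is a hypothetical resource//

Bitmap avatar = BitmapFactory.decodeResource(getResources(), R.drawable.contact_avatar);

Notification.BubbleMetadata bubbleData =
        new Notification.BubbleMetadata.Builder()
                .setDesiredHeight(600)

//createWithAdaptiveBitmap lets the system mask the bitmap into the device’s icon shape//

                .setIcon(Icon.createWithAdaptiveBitmap(avatar))
                .setIntent(bubbleIntent)
                .build();
```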

We can also set a desired height for the bubble’s content, although this value will be ignored when there isn’t enough onscreen space available.

This gives us the following:

               Notification.BubbleMetadata bubbleData =
                       new Notification.BubbleMetadata.Builder()
                               .setDesiredHeight(600)
                               .setIcon(Icon.createWithResource(MainActivity.this, R.drawable.ic_message))
                               .setIntent(bubbleIntent)
                               .build();

4. Add the metadata to the bubble

Next, we need to attach the BubbleMetadata object to our notification.

Android Q adds a new setBubbleMetadata() method to the Notification.Builder class. This method takes an instance of BubbleMetadata, which is used to display your bubble’s content when it’s in an expanded state.

               .setBubbleMetadata(bubbleData);

The completed MainActivity

After completing all the above steps, your MainActivity should look something like this:

import androidx.appcompat.app.AppCompatActivity;
import android.app.Notification;
import android.app.NotificationChannel;
import android.app.NotificationManager;
import android.app.PendingIntent;

import android.content.Context;
import android.content.Intent;
import android.graphics.drawable.Icon;
import android.os.Bundle;
import android.widget.Button;
import android.view.View;

public class MainActivity extends AppCompatActivity implements View.OnClickListener {

   Button createBubble;

   Notification.Builder builder;
   NotificationManager notificationManager;
   NotificationChannel channel;

   @Override
   protected void onCreate(Bundle savedInstanceState) {
       super.onCreate(savedInstanceState);
       setContentView(R.layout.activity_main);

       createBubble = findViewById(R.id.createBubble);

       notificationManager = (NotificationManager) getSystemService(Context.NOTIFICATION_SERVICE);

       CharSequence name = "My new channel";
       String description = "Description";
       int importance = NotificationManager.IMPORTANCE_HIGH;

//Create the channel object//

       channel = new NotificationChannel("1", name, importance);
       channel.setDescription(description);
       channel.setAllowBubbles(true);

       createBubble.setOnClickListener(this);
   }

   @Override
   public void onClick(View view) {

       switch (view.getId()) {
           case R.id.createBubble:

//The Activity that’ll be displayed inside our expanded bubble//

                Intent target = new Intent(MainActivity.this, BubbleActivity.class);

//Create a PendingIntent//

                PendingIntent bubbleIntent =
                        PendingIntent.getActivity(MainActivity.this, 0, target, PendingIntent.FLAG_UPDATE_CURRENT /* flags */);

//Create a BubbleMetadata object//

                Notification.BubbleMetadata bubbleData =
                        new Notification.BubbleMetadata.Builder()

//Specify the bubble’s desired height//

                                .setDesiredHeight(600)

//Specify the bubble’s icon//

                                .setIcon(Icon.createWithResource(MainActivity.this, R.drawable.ic_message))

//Specify the target bubble intent//

                                .setIntent(bubbleIntent)
                                .build();

                builder = new Notification.Builder(MainActivity.this, channel.getId())
                        .setSmallIcon(R.drawable.ic_message)

//Add the BubbleMetadata object//

                        .setBubbleMetadata(bubbleData);

//Submit the NotificationChannel to NotificationManager//

               notificationManager.createNotificationChannel(channel);
               notificationManager.notify(1, builder.build());
               break;

       }

   }
}

Creating the bubble icon

Our MainActivity references an “ic_message” drawable, which will be used to represent our bubble in its initial, collapsed state. Let’s create this icon now:

  • Select “File > New > Image Asset” from the Android Studio toolbar.
  • Open the “Icon Type” dropdown and select “Action Bar and Tab Icons.”
  • Make sure the “Clip Art” button is selected.
  • Give the “Clip Art” button a click.
  • Choose the image that’ll represent your bubble notification; I’m opting for “message.”
  • Click “OK.”
  • In the “Name” field, enter “ic_message.”
  • Click “Next.” Read the onscreen information, and if you’re happy to proceed then click “Finish.”

While we’re here, let’s create the other image assets that we’ll be using throughout this tutorial. Our expanded bubble will eventually use two icons to represent two distinct actions: calling the contact, and sending them a text response.

To create these drawables, repeat the above steps, but this time:

  • Select an image that’ll represent the bubble’s “call” action. I’m using the “mic” resource and naming it “ic_voice.”
  • Select an image that’ll represent the bubble’s “reply to message” action. I’m using the “reply” drawable, and naming it “ic_reply.”

Building the bubble Activity

Next, we need to create the Activity that’ll be displayed to the user every time they interact with our bubble.

  • Select “File > New > Java Class” from the Android Studio toolbar.
  • In the subsequent window, name this class “BubbleActivity.”
  • Click “OK.”

We’ll use this class to define the bubble’s content, including any actions the user can perform by interacting with the expanded bubble. To help keep our code straightforward, I’ll simply display a toast every time the user triggers the bubble’s “sendMessage” and “voiceCall” actions.

Open your BubbleActivity class, and add the following:

import androidx.appcompat.app.AppCompatActivity;
import android.os.Bundle;
import android.widget.ImageButton;
import android.widget.Toast;
import android.view.View;

public class BubbleActivity extends AppCompatActivity implements View.OnClickListener {

   @Override
   protected void onCreate(Bundle savedInstanceState) {
       super.onCreate(savedInstanceState);
       setContentView(R.layout.activity_bubble);
       ImageButton voiceCall = (ImageButton) findViewById(R.id.voice_call);
       voiceCall.setOnClickListener(this);
       ImageButton sendMessage = (ImageButton) findViewById(R.id.send);
       sendMessage.setOnClickListener(this);
   }

   @Override
   public void onClick(View v) {
       switch (v.getId()) {

           case R.id.voice_call:
               Toast.makeText(BubbleActivity.this, "Calling contact", Toast.LENGTH_SHORT).show();
               break;
           case R.id.send:
                Toast.makeText(BubbleActivity.this, "Sending message", Toast.LENGTH_SHORT).show();
               break;
       }
   }
}

Designing the expanded bubble layout

Now, we need to create a corresponding layout for our BubbleActivity. This layout will consist of:

  • A RecyclerView. In a real-world messaging app, this is where we’d display the newly received message, plus any previous messages.
  • An EditText. This will enable the user to type their response directly into the bubble notification.
  • Two ImageButtons. These will display icons that the user can tap, in order to send a text response or call the person who sent this message.

Create a new layout file named “activity_bubble,” by Control-clicking your project’s “layout” directory and then selecting “New > Layout resource file” from the context menu.

Open your “activity_bubble.xml” file, and add the following:

<?xml version="1.0" encoding="utf-8"?>
<LinearLayout
    xmlns:android="http://schemas.android.com/apk/res/android"
    xmlns:app="http://schemas.android.com/apk/res-auto"
    android:id="@+id/newMessage"
    android:layout_width="match_parent"
    android:layout_height="match_parent"
    android:orientation="vertical">

    <androidx.recyclerview.widget.RecyclerView
        android:id="@+id/messages"
        android:layout_width="match_parent"
        android:layout_height="0dp"
        android:layout_weight="1"
        android:scrollbars="vertical" />

    <LinearLayout
        android:id="@+id/input_bar"
        android:layout_width="match_parent"
        android:layout_height="?attr/actionBarSize"
        android:orientation="horizontal">

        <ImageButton
            android:id="@+id/voice_call"
            style="?attr/buttonBarNeutralButtonStyle"
            android:layout_width="wrap_content"
            android:layout_height="match_parent"
            android:onClick="onClick"
            android:tint="?attr/colorAccent"
            app:srcCompat="@drawable/ic_voice" />

        <EditText
            android:id="@+id/input"
            android:layout_width="0dp"
            android:layout_height="match_parent"
            android:layout_weight="1"
            android:hint="Enter message"
            android:inputType="textCapSentences" />

        <ImageButton
            android:id="@+id/send"
            style="?attr/buttonBarNeutralButtonStyle"
            android:layout_width="wrap_content"
            android:layout_height="match_parent"
            android:onClick="onClick"
            android:tint="?attr/colorAccent"
            app:srcCompat="@drawable/ic_reply" />

    </LinearLayout>

</LinearLayout>

Multi-window and document UI: Updating the Manifest

If Android is going to recognize BubbleActivity as an expanded bubble, then we need to open our Manifest and make a few changes to its “BubbleActivity” declaration.

1. Add multi-window support

Start by specifying that your BubbleActivity supports Android’s multi-window display:

    android:resizeableActivity="true"

2. Enable allowEmbedded

Bubbles are displayed inside a container that’s owned by another Activity, so our next task is declaring that BubbleActivity can be launched as the embedded child of another Activity:

       android:allowEmbedded="true"

3. Allow multiple instances

Sometimes, your application may need to display multiple bubbles of the same type.

Since we’re creating a chat application, there’s a chance the user may receive multiple messages from different people simultaneously. To avoid confusion, it’s important we represent each conversation as its own bubble, even if that means having multiple bubbles visible onscreen.

If you want your application to display multiple bubbles of the same type, then it must be capable of launching multiple instances.

To give your app the ability to create multiple instances, add the following to your “BubbleActivity” declaration:

       android:documentLaunchMode="always"

The completed Manifest

After performing all of the above steps, your Manifest’s “BubbleActivity” section should look something like this:

   <activity
       android:name=".BubbleActivity"
       android:label="@string/title_activity_bubble"
       android:allowEmbedded="true"
       android:documentLaunchMode="always"
       android:resizeableActivity="true"
       android:theme="@style/AppTheme.NoActionBar"/>
</application>

Testing your Android Q bubbles

To test your bubble notifications, you’ll need either a physical device that’s running the Android Q preview or higher, or an Android Virtual Device (AVD) that’s configured to support Android Q.

To create a compatible AVD:

  • Select “Tools > AVD Manager” from the Android Studio toolbar.
  • Select “Create Virtual Device…”
  • Choose the device definition that you want to use, and then click “Next.”
  • On the “Select a system image” screen, choose the latest “Q” system image. If you haven’t already downloaded Android Q, then click its accompanying “Download” link and wait for the system image to be downloaded to your machine.

Download an Android Q system image

  • Give your AVD a name, and then click “Finish.”

To put your application to the test:

  • Launch your app on a compatible AVD or physical Android device.
  • Give the “Create a bubble notification” button a tap. A bubble should now appear onscreen.
  • Give the bubble icon a click, to view it as an expanded bubble.
  • If prompted, grant your application permission to display bubbles, by tapping “Allow.”
  • Give the bubble’s “call” action a click, and a “Calling contact” toast should appear.
  • Try clicking the “reply” action; a “Sending message” toast should now appear.

You can download the completed project from GitHub.

Creating automatically-expanded bubbles

Currently, all of our application’s bubbles appear in a collapsed state, and will only be expanded if the user interacts with them. However, it’s possible to create bubbles that launch in their expanded state automatically.

Typically, you should only configure a bubble to appear in an expanded state if the user performs an action that directly results in that bubble, such as tapping a button to launch a new chat window or create a new document.

You can create an expanded bubble by adding setAutoExpandBubble(true) to your BubbleMetadata object.

Just be aware that this bubble will only be posted in an expanded state if its related application is in the foreground. If the app that created the bubble isn’t in the foreground, then the setAutoExpandBubble() method will be ignored.

In the following snippet, we’re declaring that the bubble’s contents should be expanded automatically:

Notification.BubbleMetadata bubbleData =
       new Notification.BubbleMetadata.Builder()
               .setDesiredHeight(600)

//Add the following line//

               .setAutoExpandBubble(true)
               .setIcon(Icon.createWithResource(MainActivity.this, R.drawable.ic_message))
               .setIntent(bubbleIntent)
               .build();

Install the updated project on your AVD or Android device, and give the “Create a bubble notification” button a tap. Instead of the bubble icon appearing onscreen, your bubble should now launch in its expanded state automatically.

Getting the most out of bubbles: Best practices

As with every new feature, bubbles come with their own set of best practices.

When adding bubble notifications to your Android apps, it’s important to bear the following in mind:

1. Don’t overwhelm the user

Bubbles take up a significant amount of screen real estate, and have the potential to interrupt whatever the user is currently doing.

If you bombard the user with bubbles, then in the best-case scenario they’ll block your application from issuing any bubbles, and in the worst-case scenario they may even uninstall your app entirely.

To avoid alienating your users, you should only issue bubble notifications for events that are important enough to warrant the user’s immediate attention.

2. Focus on simplicity

All processes that are launched from a bubble are housed within that bubble’s container, which can often be considerably smaller than a regular Activity.

To provide a good user experience, you should avoid the temptation to pack your bubbles full of information and features, and instead create bubbles that are as lightweight and straightforward as possible.

3. Test your bubbles as regular notifications

There are circumstances where your bubbles will be presented to the user as a standard notification, for example if the device is locked or the always-on display is active.

To ensure a good user experience regardless of how your bubble is presented, you should test how each of your bubbles appears and functions when it’s displayed as a bubble notification and as a regular notification.
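
Related to this fallback behaviour, the BubbleMetadata.Builder also exposed a setSuppressNotification() method at the time of writing, which controls whether the flattened notification is hidden from the notification shade while its bubble is on display. A sketch, reusing the tutorial’s bubbleIntent (assumed to be created as shown earlier):

```java
//Sketch only: hide the flattened notification from the shade while the bubble is visible//
//bubbleIntent is assumed to exist, as in the tutorial code//

Notification.BubbleMetadata bubbleData =
        new Notification.BubbleMetadata.Builder()
                .setDesiredHeight(600)
                .setIcon(Icon.createWithResource(MainActivity.this, R.drawable.ic_message))

//Only surface this content as a bubble, not as a shade notification//

                .setSuppressNotification(true)
                .setIntent(bubbleIntent)
                .build();
```

When testing, remember that the notification will still appear in the shade whenever the bubble itself can’t be shown.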

Wrapping up

In this article, we saw how you can start using Android Q’s bubbles feature today. Over the course of this article, we’ve created an application that triggers collapsed and expanded bubbles on-demand, and populated the expanded bubble with Views and custom actions.

What other Android Q features are you looking forward to trying? Let us know in the comments below!



Lightning Labs Launches Lightning Mobile App for Bitcoin Micropayments on the Go


Lightning Labs has taken one step closer to bringing Lightning Network payments to mobile with the launch of the Lightning App for iOS and Android in alpha on the Bitcoin main net. It follows the recent launch of the Lightning App on desktop, bringing the total number of operating systems supported by the program to five.

Lightning Network is a second-layer scaling solution aimed at reducing the number of transactions burdening the Bitcoin blockchain. It seeks to provide almost instant, near-free transactions, and once fully tested, the mobile application will allow more users than ever to get to grips with the technology.

Lightning App Brings Bitcoin Micropayments to iOS and Android

The Bitcoin scaling solution known as Lightning Network is having a great 2019 thus far. The micropayments network has already received considerable exposure from the cryptocurrency community and beyond: it got a boost from the publicity-generating Lightning Torch stunt, has had Twitter’s founder and CEO singing its praises, and has recently seen its first desktop application deployed.

The latest piece of positive news from Lightning Labs, one of the main groups of developers contributing code to the project, is the launch of its Lightning App on the Bitcoin main net for iOS and Android mobile devices. The announcement, made today via a Lightning Labs blog post, makes it the first main net app with iOS, Android, macOS, Windows, and Linux support.

The post details the major considerations Lightning Labs weighed when creating its first mobile-facing applications, the most important of which was user interface design.

The team is well aware that hurdles currently stand in the way of a seamless user experience with both Bitcoin and the Lightning Network. With this in mind, the application was designed to make onboarding new, perhaps less technical, users as easy as possible.

Naturally, great importance has also been placed on user security. By sandboxing the applications, Lightning Labs claims to have made them even more secure than their desktop counterparts. The apps also use the hardware features of the devices themselves to make them less vulnerable to compromise. Currently, an iOS user’s private key is password protected and encrypted using the iPhone’s Secure Enclave. There is no working Android equivalent yet, although there are plans to support advanced hardware, such as the Pixel 3’s Titan M security module, in a similar fashion.

Although the launch of the Lightning App on mobile is certainly an exciting day for Bitcoin and its second layer scaling solution, Lightning Labs has warned users to exercise caution when using the app. The Lightning Network itself and any applications built on top of it are still in very early stages of development. Users wanting to experiment with either the app or the network are advised to only do so with money that they are comfortable with losing.

 

Related Reading: Jack Dorsey Believes the Internet Will Soon Have One Currency, But Will It Be Bitcoin?

Featured Images from Shutterstock.





How to Gain a Competitive Advantage With Your Enterprise Mobile App Strategy: 5 Tips to Follow


Photo by Scott Webb

How effective is your enterprise mobile app strategy exactly? Every digital marketer just loves to talk about consumer-facing apps and they probably have a good reason for it. They share your brand’s message and value proposition with the rest of the world.

But there’s one more thing that remains somewhat overlooked. Yet, it’s something that can provide smart companies with a definitive edge over their competition by easing tasks, increasing work productivity, and building connections with consumers like never before. So, say hello to enterprise mobile apps.

A survey of 1,500 enterprise decision-makers conducted by Adobe revealed that 61 percent of them believe that companies put themselves at risk by not embracing mobile applications.

What’s more interesting, however, is that 66 percent of them also said they are lagging behind their competitors when it comes to their enterprise mobile apps. This is where the importance of a proper, precise strategy steps into the picture.

You probably know where this is headed. Following are a few tips for creating an enterprise mobile app strategy that could give you the competitive edge you need.

Focus On Mission-Critical Applications

In Adobe’s study, the executives surveyed identified a few types of enterprise mobile apps that they consider mission-critical. These include:

– Sales Enablement
– Messaging and Collaboration
– Customer Relationship Management
– Customer Service and Support

To identify your own mission-critical apps, however, you’d have to look at your organization. Map out the areas that need improvement and focus on those that tend to yield the most benefit.

Photo by rawpixel.com

First, start by supplementing or replacing your existing core desktop applications with mobile-oriented equivalents. From there, expand into areas where mobile and device-specific features will yield competitive benefits and additional productivity.

Centralize The Management of Your App

Creating a centralized, unified dashboard can be the perfect way to get through emerging obstacles quickly and effectively. Taking a holistic approach will enable you to identify traffic patterns, including usage errors and security anomalies.

As part of your strategy, you’d have to consider establishing a core team which will oversee the top concerns. Some of these challenges include:

– Lack of customization within the app
– Security
– Errors caused by updates

Understand The View of Your Organization

Digging deeper into Adobe’s survey, the results suggest that employees who use enterprise mobile apps tend to feel up to date, productive, and empowered.

However, that’s not always the case. The efficient adoption of new technology can be complex and it could also cause feelings of distraction, helplessness, and confusion. Therefore, it’s critical to keep constant communication within the organization itself.

Rallying your teams around the deployment of enterprise mobile apps aimed at ensuring employee satisfaction will help to reach an internal consensus, which is invaluable.

Deploy Critical Capabilities Needed For Success

A key step towards accomplishing this is to shake off the mobile app development myths in circulation. By doing so, you’ll be able to determine the features that will yield the best results for your brand and deploy them for success.

According to the same study, 48 percent of respondents name security as a top priority, and for good reason. Mobile devices are always on the go, and many of them contain proprietary or confidential information that shouldn’t get into the wrong hands.

Another thing to consider is the integration with other business systems. The value of your enterprise mobile app isn’t only in displaying content, but also in its easy integration which allows employees to actually do something with that content.

Make Sure Your App Is ‘Future-Proof’

Photo by LYCS Architecture

By now you’ve probably understood that the pace of change is rapid. Hence, future-proofing your enterprise mobile app is absolutely essential as far as your competitive strategy goes. Stay informed on privacy and security safeguards, emerging technologies which can improve overall user experience, and always strive to implement features which are of aid to your organization.

These could include:

– Customer Relationship Management (CRM) Platforms
– Enterprise Resource Planning (ERP)
– Effective Collaboration Tools

To stay ahead of your time, it’s also important to look at what you’ve already done. How are your current apps designed to be used? What are their existing workflows? How can they become better?

Conclusion

Enterprise mobile app development is nothing new, and many companies are already all over it. Few, however, leverage the power of a proper, robust strategy.

The above are just a few of the tips you can take advantage of in order to give yourself a nice push forward. Staying constantly informed, however, is what can really get you ahead of the game.



How to build an app that is secure, robust and scalable


When it comes to development, scalability is one of the major factors to keep in mind, alongside security and robustness. Whether you are designing a website or an application, it should scale seamlessly.

Ideally, it should perform well whether demand increases or decreases. Cyber security services depend on computing resources that are both secure and scalable. The ubiquitous nature of applications only sharpens this focus: not every app, however many man-hours, resources, meetings, and cups of coffee went into it, ends up installed on users’ phones.

Factors that shape Mobile App Development Services

Maintainability, scalability, and reliability are not easy to achieve while developing an application. But with the right approach, a secure environment, and some investment in code and architecture, it is possible. Well-designed code and application architecture make a difference in marketing, conversions, and product development.

Several factors influence fast iteration, prototyping, development speed, and feature validation, and these in turn can improve e-commerce metrics and conversion rates. Implementation should follow requirements closely and be extremely data-driven, which has become a necessity in the present world. Beyond this, an application is built up from a set of standard building blocks.

The main building blocks covered by Custom Mobile Application Development are:

• Databases, which store data so that the application or a third party can use it later.

• Caches, which remember the results of expensive operations; caching is one of the best ways to speed up an application.

• Search indexes, which are vital when users search by keyword; different ways of searching data call for different kinds of index.

• Stream processing, which sends messages from one process to another so they can be handled asynchronously.

• Batch processing, which periodically crunches the data the application has accumulated.
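To make two of these building blocks concrete, here is a minimal Python sketch of a cache and a keyword search index. It is an illustration only: the function names, the sample documents, and the sleep standing in for a slow query are all invented for the example.

```python
import functools
import time
from collections import defaultdict

# Caching: remember the result of an expensive operation so the
# real work only happens once per distinct input.
@functools.lru_cache(maxsize=128)
def expensive_lookup(user_id):
    time.sleep(0.05)  # stand-in for a slow database query
    return f"profile-{user_id}"

# Search index: map each keyword back to the documents containing it,
# so a keyword search avoids scanning every document.
def build_inverted_index(docs):
    index = defaultdict(set)
    for doc_id, text in docs.items():
        for word in text.lower().split():
            index[word].add(doc_id)
    return index

docs = {"a1": "secure scalable app", "a2": "robust secure backend"}
index = build_inverted_index(docs)
# index["secure"] now contains both document ids, "a1" and "a2"
```

Repeated calls to `expensive_lookup` with the same argument are served from the cache instead of repeating the slow step, which is exactly the speed-up the bullet above describes.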

The major qualities an application should be built around to enhance robustness and security are:

• Reliability

One of the first things cyber intelligence services assess is how reliable the application is. The aim is to ensure it keeps working correctly, even under adverse conditions.

• Scalability

The application should also be able to grow in terms of complexity, traffic, and data volume. These are the main dimensions along which the overall application grows.

• Maintainability

The application should be easy to maintain and adapt over time, so that people can keep working on it productively.

Characteristics that affect scalability

• Framework load

The overhead a framework adds is a vital factor in scalability; the framework’s performance has a direct effect on how far features can be enhanced.

• Architecture

Cyber security services consider how the application will scale during the design stage.

• Sustainable load testing

Test overall performance under load and eliminate the application functions that bottleneck the process. The main goal is stable growth and stable performance.

• Sustainable design

Code quality is a major factor in scalability, as it affects the overall design.

• Third-party integration

Third-party integrations should be tested for failures and bottleneck operations in Mobile App Development Services.

• Hardware limitations

Scalability is not only a software concern; hardware plays a vital role in the overall process, and its limits constrain the software.

The architecture of the application

With those factors covered, the next question is how to structure the overall architecture of an application. The essential parts to keep in mind are the following:

• Backend and frontend

The essential design decision is how to split an application into a frontend and a backend. The frontend handles user interaction, while the backend configures the underlying hardware at different levels so that end users can access the system easily.

• Multitier scalability

In practice, most software follows a multi-tier model: in the overall Custom Mobile Application Development process, the client connects to an application server, which in turn talks to a database server. Each added layer plays an intrinsic role in stability and performance.

Methods and strategies for application development

Development relies on several techniques to make the application scale:

• Focus on keeping nodes functioning independently and features self-contained; this is the scalable way to build out the application.

• Keep load balancing in mind; distributing incoming connections evenly across servers improves both connectivity and load distribution.

• Handle proxy settings correctly so that requests can be coordinated across multiple servers without issue, and collapse duplicate queries so that less load reaches the database underneath.

• Use a queue system so that work proceeds as an asynchronous procedure rather than slowing everything down.

These are the major techniques followed by cyber intelligence services. Getting the best possible results from these fundamentals, however, takes time.
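As a rough sketch of two of the techniques above, the Python snippet below shows round-robin load balancing across a server pool and a worker queue that processes jobs asynchronously. The server names and job names are invented for the example; real systems would use a dedicated load balancer and a message broker.

```python
import itertools
import queue
import threading

# Load balancing: rotate incoming requests across a pool of servers.
servers = ["app-1", "app-2", "app-3"]
rotation = itertools.cycle(servers)

def route(request):
    return f"{request} -> {next(rotation)}"

# Queues: hand work to a background worker so callers never block
# on the slow step.
jobs = queue.Queue()
results = []

def worker():
    while True:
        job = jobs.get()
        if job is None:  # sentinel: shut the worker down
            break
        results.append(job.upper())  # stand-in for slow processing
        jobs.task_done()

t = threading.Thread(target=worker)
t.start()
for j in ("resize", "email", "report"):
    jobs.put(j)
jobs.join()  # wait until every queued job has been processed
jobs.put(None)
t.join()
```

The queue decouples the producer from the worker: the loop that enqueues jobs returns immediately, while `jobs.join()` gives a clean synchronization point when all work must be complete.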



Apple Unveils New Technologies for App Development



Introduction

Technology is at the very heart of human progress and accounts for much of the economic and social advancement of the past few centuries. In 1984, with the introduction of the Macintosh, Apple revolutionized the passive pattern of personal technology, and the continuous, rapid transformation since then has shaped how we communicate, work, and live. In recent years, we have witnessed major enterprises invest heavily in mobile app development solutions.

With 27.5 billion mobile applications downloaded by 2 billion smartphone users, mobile development has rocketed, with incredible compatibility and innovation. In today’s competitive world, it is becoming mandatory for every progressive enterprise to keep pace with modern technology. At WWDC, its prominent summer event at the McEnery Convention Center in San Jose, California, Apple introduced significant and innovative new technologies that let developers build apps dramatically more easily, quickly, and efficiently.

SwiftUI

The fundamental vision behind SwiftUI is faster, simpler, more interactive app development through a modern UI framework. SwiftUI offers an extremely intuitive new way to build complex app user interfaces using simple, easy-to-understand declarative code. It saves app developers time by providing numerous automatic behaviors, including accessibility, right-to-left language support, Dark Mode, interface layout, and internationalization. Because SwiftUI exposes the same API across iOS, iPadOS™, macOS®, tvOS™, and watchOS®, Apple developers can build rich apps more easily and quickly.

Xcode 11

Xcode® 11 includes a new graphical user interface design tool, so UI designers can quickly assemble a UI with SwiftUI without writing any code. The visual design tool instantly shows any change made to the UI through a modification of the code, so iOS app developers can see real-time previews of how the user interface will look and behave while assembling, testing, and refining their code. The seamless move between writing code and graphical design makes collaboration between UI designers and software developers, and UI development generally, far more productive. For every iPhone app development company, previews can run directly on connected Apple devices, showing how the app works with on-board sensors and the camera and how it responds to Multi-Touch™. Xcode 11 supports on-device debugging for iOS 8 and later, tvOS 9 and later, and watchOS 2 and later.

iPad Apps to Mac

New APIs and tools make it easy to bring iPad applications to the Mac. iOS app developers can open an active or existing iPad project in Xcode and, by simply checking a box, add fundamental Mac and windowing features. Because the macOS and iPadOS versions share the same project and source code, any change to the code carries over to both, saving developers valuable time. Users get each platform’s unique capabilities, such as the speed and precision of the Mac’s mouse, keyboard, and trackpad, and features like the Touch Bar™.

Augmented Reality

At WWDC, Apple unveiled the next generation of ARKit, which adds new features such as motion capture, people occlusion, simultaneous front and back camera use, multiple face tracking and more. Motion capture enables developers to use people’s captured movements in real time as input for AR experiences. In other words, ARKit provides a framework that lets developers create AR applications on iOS more easily, although the quality of an augmented reality app still depends on the details of the device’s physical environment. People occlusion, meanwhile, gives every iPhone app development company a simplified process for mixing virtual objects with people, creating immersive AR experiences with green-screen-style effects. Furthermore, iOS users can now use the front and back cameras simultaneously, and collaborative sessions make it faster to build a shared world map for AR experiences among multiple people.

Core ML and Create ML

Using Core ML 3, which supports advanced real-time machine learning models, applications can deliver experiences that deeply understand vision, natural language, and speech. With on-device model personalization, custom iPhone app development becomes possible without compromising user privacy, and developers can confidently offer personalized features and update their machine learning models. Create ML is a suite of machine learning tools built around the Swift programming language and macOS playgrounds for creating and training custom machine learning models on the Mac. With Create ML, developers don’t need to write code to build machine learning models.

Apple Watch

With the introduction of watchOS 6 and an App Store on the watch itself, developers can now build apps that work independently on Apple Watch, without an iPhone. Apps can also take advantage of the Apple Neural Engine in Series 4 watches through Core ML. Thanks to a new dedicated streaming audio API, users can stream their best-loved third-party media applications right from the watch, and an extended runtime API grants apps additional time to complete tasks while they remain in the foreground.

Fast, Easy and Private Sign in Using Apple ID

One more major announcement at WWDC 2019 was fast, private sign-in to websites and apps using an Apple ID. Instead of verifying an email address, choosing a password, and filling out a form, iOS users can set up an account and sign in with their Apple ID in far less time. Given today’s security threats, strong safeguards are essential, so all accounts are protected with two-factor authentication and built-in anti-fraud measures, a great way to improve an app’s security. Through a newly introduced privacy-focused email relay service, users can receive necessary messages without disclosing their real email address.



Pro app developers react to the new Mac Pro and Pro Display XDR



Leading app developers for a variety of workflows, from video and photo editing to music production and advanced 3D content creation, have announced their support for the all-new Mac Pro and Pro Display XDR.

Adobe
“We’re incredibly excited about the new Mac Pro, which represents a strong commitment from Apple towards creatives working in 3D. We’ve already started porting the Substance line of tools, as well as Dimension, to Apple’s new graphic API Metal to fully take advantage of the immense power the new Mac Pro hardware offers and empower 3D creatives in unprecedented ways.” — Sebastien Deguy, vice president of 3D and Immersive, Adobe

“Apple continues to innovate for video professionals. With the power offered by the new Mac Pro, editors will be able to work with 8K without the need for any proxy workflows in a future release of Premiere Pro.” — Steven Warner, vice president of Digital Video and Audio, Adobe

“We can’t wait to leverage Apple’s new Pro Display XDR and to support its capabilities to the fullest in an upcoming release of Photoshop. For the first time, customers will be able to see and edit their Photoshop files in high dynamic range and their photos will come to life, revealing details not visible before.” — Maria Yap, vice president of Digital Imaging, Adobe

OTOY
“OTOY is incredibly excited about the all-new Mac Pro and how it will empower our users. Octane X — the 10th anniversary edition of Octane — has been rewritten from the ground up in Metal for Mac Pro, and is the culmination of a long and deep collaboration with Apple’s world-class engineering team. Mac Pro is like nothing we’ve seen before in a desktop system. Octane X will be leveraging this unprecedented performance to take interactive and production GPU rendering for film, TV, motion graphics and AR/VR to a whole new level. Octane X is truly a labor of love, and we can’t wait to get it into the hands of our Mac customers later this year.” —  Jules Urbach, CEO and founder, OTOY

Blackmagic Design
“DaVinci Resolve is the world’s most advanced color correction and online editing software for high-end film and television work. It was the first professional software to adopt Metal and now, with the new Mac Pro and Afterburner, we’re seeing full-quality 8K performance in real time with color correction and effects, something we could never dream of doing before. DaVinci Resolve running on the new Mac Pro is easily the fastest way to edit, grade and finish movies and TV shows.” — Grant Petty, CEO, Blackmagic Design

Maxon
“Tapping into the amazing performance of the new Mac Pro, we’re excited to develop Redshift for Metal, and we’re working with Apple to bring an optimized version to the Mac Pro for the first time by the end of the year. We’re also actively developing Metal support for Cinema 4D, which will provide our Mac users with accelerated workflows for the most complex content creation. The new Mac Pro graphics architecture is incredibly powerful and is the best system to run Cinema 4D.” — David McGavran, CEO, Maxon

Avid
“Avid’s Pro Tools team is blown away by the unprecedented processing power of the new Mac Pro, and thanks to its internal expansion capabilities, up to six Pro Tools HDX cards can be installed within the system – a first for Avid’s flagship audio workstation. We’re now able to deliver never-before-seen performance and capabilities for audio production in a single system and deliver a platform that professional users in music and post have been eagerly awaiting.” — Francois Quereuil, director of Product Management, Avid

Unity
“We’re so excited for Unity creators to tap into the incredible power of the all-new Mac Pro. Our powerful and accessible real-time technology, combined with Mac Pro’s massive CPU power and Metal-enabled high-end graphics performance, along with the gorgeous new Pro Display XDR, will give creators everything they need to create the next smash-hit game, augmented reality experience or award-winning animated feature.” — Ralph Hauwert, vice president of Platforms, Unity

Pixar
“We are thrilled to announce full Metal support in Hydra in an upcoming release of USD toward the end of the year. Together with this new release, the new Mac Pro will dramatically accelerate the most demanding 3D graphics workflows thanks to an excellent combination of memory, bandwidth and computational performance. This new machine clearly shows Apple is delivering on the needs of professionals at high-end production facilities like Pixar.” — Guido Quaroni, vice president of Software Research and Development, Pixar

Autodesk
“Autodesk is fully embracing the all-new Mac Pro and we are already working on optimized updates to AutoCAD, Maya, Fusion and Flame. This level of innovation, combined with next-generation graphics APIs, such as Metal, bring extremely high graphics performance and visual fidelity to our Design, Manufacturing and Creation products and enable us to bring greater value to our customers.” — Amy Bunszel, senior vice president, Autodesk Design and Creation Products

Red Digital Cinema
“Apple’s new hardware will bring a mind-blowing level of performance to Metal-accelerated, proxy-free R3D workflows in Final Cut Pro X that editors truly have never seen before. We are very excited to bring a Metal-optimized version of R3D in September.” — Jarred Land, president, Red Digital Cinema

Foundry
“With the all-new Mac Pro, Apple delivers incredible performance for media and entertainment professionals, and we can’t wait to see what our customers create with the immense power and flexibility that Mac Pro brings to artists. HDR is quickly becoming the standard for capturing and delivering high quality content, and the Pro Display XDR will enable Nuke and Nuke Studio artists to work closer to the final image on their desktop, improving their speed and giving them the freedom to focus on the quality of their work. We look forward to updating our products to take advantage of what Mac Pro offers.” — Jody Madden, chief product and customer officer, Foundry

Universal Audio
“The new Mac Pro is a breakthrough in recording and mixing performance. Thunderbolt 3 and the numerous PCIe slots for installing UAD plug-in co-processors pair perfectly with our Apollo X series of audio interfaces. Combined with the sheer processing power of the Mac Pro, our most demanding users will be able to track and mix the largest sessions effortlessly.” – Bill Putnam Jr., CEO, Universal Audio

Cine Tracer
“Thanks to the unbelievable power of the new Mac Pro, users of Cine Tracer will be able to work in 4K and higher resolution in real time when visualizing their projects. And with twice as many lights to work with in the same scene, combined with Unreal Engine’s real-time graphics technology, artists can now load scenes that were previously too large or graphically taxing.” — Matt Workman, developer, Cine Tracer

Pixelmator
“The new Mac Pro is insanely fast — it’s by far the fastest image editor we’ve ever experienced or seen. With the incredible Pro Display XDR, all-new photo editing workflows are now a reality. When editing RAW shots, users can choose to view extended dynamic detail in images, invisible on other displays, for a phenomenal viewing experience like we never imagined.” — Simonas Bastys, lead developer, Pixelmator

Serif
“Affinity Photo users demand the highest levels of performance, and the new, insanely powerful Mac Pro, coupled with the new discrete, multi-GPU support in Photo 1.7 allows our users to work in real time on massive, deep-color projects. Thanks to our extensive Metal adoption, every stage of the editing process is accelerated. And as Photo scales linearly with multiple GPUs, users will see up to four-time performance gains over the iMac Pro and 20 times over typical PC hardware. It’s the fastest system we’ve ever run on. Our Metal support also means incredible HDR support for the new Pro Display XDR.” — Ashley Hewson, managing director, Serif

SideFX
“With the new Mac Pro’s incredible compute performance and amazing graphics architecture, Houdini users will be able to work faster and more efficiently, unleashing a whole new level of creativity.” — Cristin Barghiel, vice president of Product Development, SideFX

Epic Games
“Epic’s Unreal Engine on the new Mac Pro takes advantage of its incredible graphics performance to deliver amazing visual quality, and will enable workflows that were never possible before on a Mac. We can’t wait to see how the new Mac Pro enhances our customers’ limitless creativity in cinematic production, visualization, games and more.” — Kim Libreri, CTO, Epic Games



MINDs Lab to Feature AI-based English Education App, Smart Factory, Smart City at CES Asia 2019


SHANGHAI, June 7, 2019 /PRNewswire/ — The Korean AI company MINDs Lab, a member company of the Born2Global Centre, will be showcasing its latest AI technologies and mobile applications at CES Asia 2019 (Hall N3 -3461).

MINDs Lab will be participating in CES Asia 2019 to introduce its AI platform “maum AI”.

MINDs Lab will be participating in CES Asia 2019, held in Shanghai from June 11 to 13, to introduce its AI platform, “maum AI,” as well as an English conversation-learning application and AI-based smart factory/city solutions. MINDs Lab is a major Korean AI company that has set the record for the highest sales in the domestic market for a single AI product. Currently, the company is engaged in projects in various areas: APIs (AI engines), smart factory/city solutions, AI-based English language education, and AI-based hybrid customer service centers.

MINDs Lab will be introducing a wide range of AI solutions at CES Asia 2019, including an AI-based English language education service and smart factory/city solutions. The AI English education service will be showcased via “mAI English,” an AI-based English conversation mobile application that was launched in Korea in May. The application is based on AI technology that is capable of assessing the user’s accuracy of pronunciation and linguistic articulation and enables the user to practice English speaking skills anywhere and at any time through conversations with an AI partner. The technology can be offered as a B2B solution, applicable not only to mobile applications like mAI English but also to voice recognition, AI voice production, and even chatbot technology.

The AI smart factory and smart city solutions, which combine IoT technology with the latest AI algorithms, are also worthy of note. MINDs Lab’s smart factory framework, “maum MAAL,” collects and analyzes data in real time through an AI deep-learning algorithm, allowing it to increase the efficiency and productivity of the entire production process by determining optimum input levels, production amounts, temperatures, speeds, and other aspects of production. Since 2017, MINDs Lab has been carrying out a large-scale smart factory project with the global steel corporation POSCO.

MINDs Lab’s smart city project, which includes solutions for recognizing car license plate numbers, text, and car models and for detecting cars in video footage, is being conducted with the Seoul Metropolitan Government. The company is also working on a project with Suwon City to detect and analyze abnormal behavior via CCTV footage. A project with the Daegu Metropolitan Government to create Korea’s first AI-based complaint consultation chatbot (“Ddubot”) resulted in the city of Daegu achieving the fastest complaint processing time among local governments in Korea.

A MINDs Lab spokesperson said, “Our AI platform ‘maum AI’ is a general-purpose AI platform that can be used with diverse AI algorithms, engines, and data. Through CES Asia, we hope to find Chinese partners with high potential.”

For more detailed information on MINDs Lab, visit https://www.mindslab.ai/en

Media Contact

MINDs Lab: [email protected]
Born2Global Centre: [email protected]

Photo – https://photos.prnasia.com/prnh/20190604/2486404-1



Apple unveils groundbreaking new technologies for app development



Xcode 11 Brings SwiftUI to Life

A new graphical UI design tool built into Xcode 11 makes it easy for UI designers to quickly assemble a user interface with SwiftUI — without having to write any code. Swift code is automatically generated, and when this code is modified, the changes to the UI instantly appear in the visual design tool. Now developers can see automatic, real-time previews of how the UI will look and behave as they assemble, test and refine their code. The ability to fluidly move between graphical design and writing code makes UI development more fun and efficient, and makes it possible for software developers and UI designers to collaborate more closely. Previews can run directly on connected Apple devices, including iPhone, iPad, iPod touch, Apple Watch and Apple TV, allowing developers to see how an app responds to Multi-Touch, or works with the camera and on-board sensors — live, as the interface is being built.
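To give a sense of the declarative style described above, here is a minimal SwiftUI sketch (the view and preview names are illustrative, not from the press release): the `body` property describes the interface, and the `PreviewProvider` is what drives Xcode 11’s live canvas.

```swift
import SwiftUI

// A minimal declarative view: the body describes what the UI is,
// and Xcode 11 renders it live in the preview canvas as you type.
struct ContentView: View {
    var body: some View {
        VStack(spacing: 12) {
            Text("Hello, SwiftUI")
                .font(.title)
            Button("Tap me") {
                print("Button tapped")
            }
        }
    }
}

// The PreviewProvider feeds Xcode's real-time preview; editing the
// code updates the canvas, and edits made in the canvas are written
// back as Swift code.
struct ContentView_Previews: PreviewProvider {
    static var previews: some View {
        ContentView()
    }
}
```

Because the canvas round-trips through generated Swift code, designers and developers are editing the same source of truth rather than separate design files.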

Augmented Reality

ARKit 3 puts people at the center of AR. With Motion Capture, developers can integrate people’s movement into their app, and with People Occlusion, AR content will show up naturally in front of or behind people to enable more immersive AR experiences and fun green screen-like applications. ARKit 3 also enables the front camera to track up to three faces, as well as simultaneous front and back camera support. It also enables collaborative sessions, which make it even faster to jump into a shared AR experience.
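The features above map onto configuration options in Apple’s public ARKit API. The following is a minimal sketch of enabling them, with runtime capability checks, since People Occlusion and simultaneous front/back camera support require A12-class devices; it is illustrative, not a complete app.

```swift
import ARKit

// Sketch: configuring an ARKit 3 world-tracking session.
let configuration = ARWorldTrackingConfiguration()

// People Occlusion: segment people (with depth) so virtual content
// can render naturally in front of or behind them.
if ARWorldTrackingConfiguration.supportsFrameSemantics(.personSegmentationWithDepth) {
    configuration.frameSemantics.insert(.personSegmentationWithDepth)
}

// Simultaneous front and back camera: track the user's face with the
// front camera during a back-camera world-tracking session.
if ARWorldTrackingConfiguration.supportsUserFaceTracking {
    configuration.userFaceTracking = true
}

// Collaborative sessions: share anchors with nearby devices so users
// can jump into a shared AR experience.
configuration.isCollaborationEnabled = true

let session = ARSession()
session.run(configuration)
```

Motion Capture uses a separate `ARBodyTrackingConfiguration`, and front-camera face tracking of up to three faces is exposed through `ARFaceTrackingConfiguration`’s tracked-face limits.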