Tag Archives: react

Building Mobile Apps Using React Native And WordPress

Muhammad Muhsin



As web developers, you might have thought that mobile app development calls for a fresh learning curve with another programming language. Perhaps Java and Swift need to be added to your skill set to hit the ground running with both iOS and Android, and that might bog you down.

But this article has a surprise in store for you! We will look at building an e-commerce application for iOS and Android using the WooCommerce platform as our backend. This would be an ideal starting point for anyone willing to get into native cross-platform development.

A Brief History Of Cross-Platform Development

It’s 2011, and we see the beginning of hybrid mobile app development. Frameworks like Apache Cordova, PhoneGap, and Ionic Framework slowly emerge. Everything looks good, and web developers are eagerly coding away mobile apps with their existing knowledge.

However, mobile apps still looked like mobile versions of websites, with none of the native design elements like Android’s material design or iOS’s flat look. Navigation worked similarly to the web, and transitions were not buttery smooth. Users were not satisfied with apps built using the hybrid approach and dreamt of the native experience.

Fast forward to March 2015, and React Native appears on the scene. Developers are able to build truly native cross-platform applications using React, a favorite JavaScript library for many developers. They are now easily able to learn a small library on top of what they know with JavaScript. With this knowledge, developers are now targeting the web, iOS and Android.

Furthermore, changes made to the code during development are loaded onto the testing devices almost instantly! With native development through other approaches, this used to take several minutes. Developers are able to enjoy the instant feedback they used to love with web development.

React developers are more than happy to be able to carry the patterns they already follow over to an entirely new platform. In fact, they are targeting two more platforms with what they already know very well.

This is all good for front-end development. But what choices do we have for back-end technology? Do we still have to learn a new language or framework?

The WordPress REST API

In late 2016, WordPress released the much-awaited REST API to its core, and opened the doors for solutions with decoupled backends.

So, if you already have a WordPress and WooCommerce website and wish to retain exactly the same offerings and user profiles across your website and native app, this article is for you!

Assumptions Made In This Article

I will walk you through using your WordPress skills to build a mobile app with a WooCommerce store using React Native. The article assumes:

  • You are familiar with the different WordPress APIs, at least at a beginner level.
  • You are familiar with the basics of React.
  • You have a WordPress development server ready. I use Ubuntu with Apache.
  • You have an Android or an iOS device to test with Expo.

What We Will Build In This Tutorial

The project we are going to build through this article is a fashion store app. The app will have the following functionalities:

  • Shop page listing all products,
  • Single product page with details of the selected item,
  • ‘Add to cart’ feature,
  • ‘Show items in cart’ feature,
  • ‘Remove item from cart’ feature.

This article aims to inspire you to use this project as a starting point to build complex mobile apps using React Native.

Note: For the full application, you can visit my project on GitHub and clone it.

Getting Started With Our Project

We will begin building the app as per the official React Native documentation. Having installed Node on your development environment, open up the command prompt and type in the following command to install the Create React Native App globally.

npm install -g create-react-native-app

Next, we can create our project:

create-react-native-app react-native-woocommerce-store

This will create a new React Native project which we can test with Expo.

Next, we will need to install the Expo app on the mobile device we want to test with. It is available for both iOS and Android.

Once the Expo app is installed, we can run npm start on our development machine.

cd react-native-woocommerce-store

npm start


Starting a React Native project through the command line via Expo.

After that, you can scan the QR code through the Expo app or enter the given URL in the app’s search bar. This will run the basic ‘Hello World’ app on the phone. We can now edit App.js to make instant changes to the app running on the phone.
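
For instance, a quick way to see this live reloading in action is to tweak the default component in App.js. A minimal sketch of what that file could look like at this stage (the exact boilerplate generated by Create React Native App may differ slightly):

import React from 'react';
import { Text, View } from 'react-native';

export default class App extends React.Component {
  render() {
    return (
      <View style={{ flex: 1, alignItems: 'center', justifyContent: 'center' }}>
        {/* Change this text and save — the device updates almost instantly */}
        <Text>Hello WooCommerce Store!</Text>
      </View>
    );
  }
}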

Alternatively, you can run the app on an emulator. But for brevity and accuracy, we will cover running it on an actual device.

Next, let’s install all the required packages for the app using this command:

npm install --save axios react-native-htmlview react-navigation react-redux redux redux-thunk

Setting Up A WordPress Site

Since this article is about creating a React Native app, we will not go into details about creating a WordPress site. Please refer to this article on how to install WordPress on Ubuntu. As the WooCommerce REST API requires HTTPS, please make sure it is set up using Let’s Encrypt. Please refer to this article for a how-to guide.

We are not creating a WordPress installation on localhost since we will be running the app on a mobile device, and also since HTTPS is needed.

Once WordPress and HTTPS are successfully set up, we can install the WooCommerce plugin on the site.


Installing the WooCommerce plugin to our WordPress installation.

After installing and activating the plugin, continue with the WooCommerce store setup by following the wizard. After the wizard is complete, click on ‘Return to dashboard.’

You will be greeted by another prompt.


Adding example products to WooCommerce.

Click on ‘Let’s go’ to add the example products. This will save us the time of creating our own products to display in the app.

Constants File

To load our store’s products from the WooCommerce REST API, we need the relevant keys in place inside our app. For this purpose, we can have a constants.js file.

First, create a folder called ‘src’ and create the following subfolders inside it:


Create the file ‘Constants.js’ within the constants folder.

Now, let’s generate the keys for WooCommerce. In the WordPress dashboard, navigate to WooCommerce → Settings → API → Keys/Apps and click on ‘Add Key.’

Next, create a Read-Only key with the name ‘React Native’. Copy the Consumer Key and Consumer Secret over to the constants.js file as follows:

const Constants = {
   URL: {
      wc: 'https://woocommerce-store.on-its-way.com/wp-json/wc/v2/'
   },
   Keys: {
      ConsumerKey: 'CONSUMER_KEY_HERE',
      ConsumerSecret: 'CONSUMER_SECRET_HERE'
   }
}
export default Constants;

Starting With React Navigation

React Navigation is a community solution to navigating between the different screens and is a standalone library. It allows developers to set up the screens of the React Native app with just a few lines of code.

There are different navigation methods within React Navigation:

  • Stack,
  • Switch,
  • Tabs,
  • Drawer,
  • and more.

For our application, we will use a combination of StackNavigation and DrawerNavigation to navigate between the different screens. StackNavigation is similar to how browser history works on the web. We are using it because it provides an interface for the header and the header navigation icons. It has push and pop operations similar to stacks in data structures: push adds a new screen to the top of the navigation stack, and pop removes a screen from the stack.
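
For example, pushing and popping screens later in the app boils down to calls like these (an illustrative snippet; the navigate call with the product param is used in the shop screen further below):

// Push the Product screen onto the stack, passing the tapped product as a param:
this.props.navigation.navigate('Product', { product: item });

// Pop the current screen off the stack and return to the previous one:
this.props.navigation.goBack();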

The code shows that the StackNavigation, in fact, houses the DrawerNavigation within itself. It also takes properties for the header style and header buttons. We are placing the navigation drawer button to the left and the shopping cart button to the right. The drawer button switches the drawer on and off whereas the cart button takes the user to the shopping cart screen.

const StackNavigation = StackNavigator({
  DrawerNavigation: { screen: DrawerNavigation }
}, {
   headerMode: 'float',
   navigationOptions: ({ navigation, screenProps }) => ({
     headerStyle: { backgroundColor: '#4C3E54' },
     headerTintColor: 'white',
     headerLeft: drawerButton(navigation),
     headerRight: cartButton(navigation, screenProps)
   })
 });

const drawerButton = (navigation) => (
 <Text
   style={{ padding: 15, color: 'white' }}
   onPress={() => {
     if (navigation.state.index === 0) {
       navigation.navigate('DrawerOpen')
     } else {
       navigation.navigate('DrawerClose')
     }
   }}
 >
   {/* hamburger icon (placeholder glyph) */}
   ☰
 </Text>
);

const cartButton = (navigation, screenProps) => (
 <Text
   style={{ padding: 15, color: 'white' }}
   onPress={() => { navigation.navigate('CartPage') }}
 >
   <EvilIcons name="cart" size={30} />
   {screenProps.cartCount}
 </Text>
);

DrawerNavigation, on the other hand, provides the side drawer which will allow us to navigate between Home, Shop, and Cart. The DrawerNavigator lists the different screens that the user can visit, namely the Home page, the Products page, the Product page, and the Cart page. It also has a property which will take the drawer container: the sliding menu which opens up when clicking the hamburger menu.

const DrawerNavigation = DrawerNavigator({
 Home: {
   screen: HomePage,
   navigationOptions: {
     title: "RN WC Store"
   }
 },
 Products: {
   screen: Products,
   navigationOptions: {
     title: "Shop"
   }
 },
 Product: {
   screen: Product,
   navigationOptions: ({ navigation }) => ({
     title: navigation.state.params.product.name
   }),
 },
 CartPage: {
   screen: CartPage,
   navigationOptions: {
     title: "Cart"
   }
 }
}, {
   contentComponent: DrawerContainer
 });



Left: The Home page (homepage.js). Right: The open drawer (DrawerContainer.js).

Injecting The Redux Store To App.js

Since we are using Redux in this app, we have to inject the store into our app. We do this with the help of the Provider component.

const store = configureStore();

class App extends React.Component {
 render() {
   return (
     <Provider store={store}>
       <ConnectedApp />
     </Provider>
   )
 }
}

We will then have a ConnectedApp component so that we can have the cart count in the header.

class CA extends React.Component {
 render() {
   const cart = {
     cartCount: this.props.cart.length
   }
   return (
     <StackNavigation screenProps={cart} />
   );
 }
}

function mapStateToProps(state) {
 return {
   cart: state.cart
 };
}

const ConnectedApp = connect(mapStateToProps, null)(CA);

Redux Store, Actions, And Reducers

In Redux, we have three different parts:

  1. Store
    Holds the state of your entire application. The only way to change state is to dispatch an action to it.
  2. Actions
    A plain object that represents an intention to change the state.
  3. Reducers
    A function that accepts the current state and an action, and returns a new state.

These three components of Redux help us achieve a predictable state for the entire app. For simplicity, we will look at how the products are fetched and saved in the Redux store.

First of all, let’s look at the code for creating the store:

let middleware = [thunk];

export default function configureStore() {
    return createStore(
        RootReducer,
        applyMiddleware(...middleware)
    );
}
Next, the products action is responsible for fetching the products from the remote website.

export function getProducts() {
   return (dispatch) => {
       const url = `${Constants.URL.wc}products?per_page=100&consumer_key=${Constants.Keys.ConsumerKey}&consumer_secret=${Constants.Keys.ConsumerSecret}`

       return axios.get(url).then(response => {
           dispatch({
               type: types.GET_PRODUCTS_SUCCESS,
               products: response.data
           })
       }).catch(err => {
           console.log(err.error);
       })
   };
}

The products reducer is responsible for returning the payload of data and whether it needs to be modified.

export default function (state = InitialState.products, action) {
    switch (action.type) {
        case types.GET_PRODUCTS_SUCCESS:
            return action.products;
        default:
            return state;
    }
}
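
The InitialState module referenced above is not shown in this excerpt. A minimal sketch of what it could contain, assuming the products and cart both start out empty (the actual file lives in the repository):

// Hypothetical InitialState module — shape assumed from how the reducers use it.
const InitialState = {
  products: [],
  cart: []
};

export default InitialState;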

Displaying The WooCommerce Shop

The products.js file is our Shop page. It basically displays the list of products from WooCommerce.

class ProductsList extends Component {

 componentDidMount() {
   this.props.ProductAction.getProducts();
 }

 _keyExtractor = (item, index) => item.id;

 render() {
   const { navigate } = this.props.navigation;
   const Items = (
     <FlatList contentContainerStyle={styles.list} numColumns={2}
       data={this.props.products}
       keyExtractor={this._keyExtractor}
       renderItem={
         ({ item }) => (
           <TouchableHighlight style={{ width: '50%' }} onPress={() => navigate("Product", { product: item })} underlayColor="white">
             <View style={styles.view} >
               <Image style={styles.image} source={{ uri: item.images[0].src }} />
               <Text style={styles.text}>{item.name}</Text>
             </View>
           </TouchableHighlight>
         )
       }
     />
   );
   return (
     <ScrollView>
       {this.props.products.length ? Items :
         <View style={{ alignItems: 'center', justifyContent: 'center' }}>
           <Image style={styles.loader} source={LoadingAnimation} />
         </View>
       }
     </ScrollView>
   );
 }
}

this.props.ProductAction.getProducts() and this.props.products are possible because of mapStateToProps and mapDispatchToProps.


Products listing screen.

mapStateToProps and mapDispatchToProps

State here refers to the Redux store, and dispatch fires the actions we defined. Both will be exposed as props in the component.

function mapStateToProps(state) {
 return {
   products: state.products
 };
}
function mapDispatchToProps(dispatch) {
 return {
   ProductAction: bindActionCreators(ProductAction, dispatch)
 };
}
export default connect(mapStateToProps, mapDispatchToProps)(ProductsList);

Styles

In React Native, styles are generally defined in the same file as the component. They’re similar to CSS, but camelCase properties are used instead of hyphenated ones.

const styles = StyleSheet.create({
 list: {
   flexDirection: 'column'
 },
 view: {
   padding: 10
 },
 loader: {
   width: 200,
   height: 200,
   alignItems: 'center',
   justifyContent: 'center',
 },
 image: {
   width: 150,
   height: 150
 },
 text: {
   textAlign: 'center',
   fontSize: 20,
   padding: 5
 }
});

Single Product Page

This page contains details of a selected product. It shows the user the name, price, and description of the product. It also has the ‘Add to cart’ function.


Single product page.
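
The ‘Add to cart’ button dispatches a cart action. A minimal sketch of what such an action creator could look like, following the same thunk pattern as getProducts and assuming the ADD_TO_CART_SUCCESS action type listed in the next section (the full implementation is in the repository):

// Sketch only — the real action also persists the cart; see the repository.
export function addToCart(product) {
   return (dispatch) => {
       dispatch({
           type: types.ADD_TO_CART_SUCCESS,
           product
       });
   };
}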

Cart Page

This screen shows the list of items in the cart. The cart action has the functions getCart, addToCart, and removeFromCart, and the reducer handles these actions accordingly. Actions are identified through actionTypes — constants that describe the action, stored in a separate file.

export const GET_PRODUCTS_SUCCESS = 'GET_PRODUCTS_SUCCESS';
export const GET_PRODUCTS_FAILED = 'GET_PRODUCTS_FAILED';

export const GET_CART_SUCCESS = 'GET_CART_SUCCESS';
export const ADD_TO_CART_SUCCESS = 'ADD_TO_CART_SUCCESS';
export const REMOVE_FROM_CART_SUCCESS = 'REMOVE_FROM_CART_SUCCESS';
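
For illustration, a cart reducer handling these action types might look like the following sketch — the payload field names here are assumptions on my part; check the repository for the actual reducer:

// Sketch of a cart reducer (assumed payload shapes: action.cart, action.product).
export default function (state = InitialState.cart, action) {
    switch (action.type) {
        case types.GET_CART_SUCCESS:
            return action.cart;
        case types.ADD_TO_CART_SUCCESS:
            return [...state, action.product];
        case types.REMOVE_FROM_CART_SUCCESS:
            return state.filter(item => item.id !== action.product.id);
        default:
            return state;
    }
}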

This is the code for the CartPage component:

class CartPage extends React.Component {

 componentDidMount() {
   this.props.CartAction.getCart();
 }

 _keyExtractor = (item, index) => item.id;

 removeItem(item) {
   this.props.CartAction.removeFromCart(item);
 }

 render() {
   const { cart } = this.props;
   console.log('render cart', cart)

   if (cart && cart.length > 0) {
     const Items = <FlatList contentContainerStyle={styles.list}
       data={cart}
       keyExtractor={this._keyExtractor}
       renderItem={({ item }) =>
         <View style={styles.lineItem} >
           <Image style={styles.image} source={{ uri: item.image }} />
           <Text style={styles.text}>{item.name}</Text>
           <Text style={styles.text}>{item.quantity}</Text>
           <TouchableOpacity style={{ marginLeft: 'auto' }} onPress={() => this.removeItem(item)}><Entypo name="cross" size={30} /></TouchableOpacity>
         </View>
       }
     />;
     return (
       <View style={styles.container}>
         {Items}
       </View>
     )
   } else {
     return (
       <View style={styles.container}>
         <Text>Cart is empty!</Text>
       </View>
     )
   }
 }
}

As you can see, we are using a FlatList to iterate through the cart items. It takes in an array and creates a list of items to be displayed on the screen.




Left: The cart page when it has items in it. Right: The cart page when it is empty.

Conclusion

You can configure information about the app, such as its name and icon, in the app.json file. The app can be published after installing the exp CLI via npm.

To sum up:

  • We now have a decent e-commerce application with React Native;
  • Expo can be used to run the project on a smartphone;
  • Existing backend technologies such as WordPress can be used;
  • Redux can be used for managing the state of the entire app;
  • Web developers, especially React developers, can leverage this knowledge to build bigger apps.

For the full application, you can visit my project on GitHub and clone it. Feel free to fork it and improve it further. As an exercise, you can continue building more features into the project, such as:

  • Checkout page,
  • Authentication,
  • Storing the cart data in AsyncStorage so that closing the app does not clear the cart.

How To Create An Audio/Video Recording App With React Native: An In-Depth Tutorial

Oleh Mryhlod



React Native is a young technology, already gaining popularity among developers. It is a great option for smooth, fast, and efficient mobile app development. High-performance rates for mobile environments, code reuse, and a strong community: These are just some of the benefits React Native provides.

In this guide, I will share some insights about the high-level capabilities of React Native and the products you can develop with it in a short period of time.

We will delve into the step-by-step process of creating a video/audio recording app with React Native and Expo. Expo is an open-source toolchain built around React Native for developing iOS and Android projects with React and JavaScript. It provides a bunch of native APIs maintained by native developers and the open-source community.

After reading this article, you should have all the necessary knowledge to create video/audio recording functionality with React Native.

Let’s get right to it.

Brief Description Of The Application

The application you will learn to develop is called a multimedia notebook. I have implemented part of this functionality in an online job board application for the film industry. The main goal of this mobile app is to connect people who work in the film industry with employers. They can create a profile, add a video or audio introduction, and apply for jobs.

The application consists of three main screens that you can switch between with the help of a tab navigator:

  • the audio recording screen,
  • the video recording screen,
  • a screen with a list of all recorded media and functionality to play back or delete them.

Check out how this app works by opening this link with Expo.

First, download Expo to your mobile phone. There are two options to open the project:

  1. Open the link in the browser, scan the QR code with your mobile phone, and wait for the project to load.
  2. Open the link with your mobile phone and click on “Open project using Expo”.

You can also open the app in the browser. Click on “Open project in the browser”. If you have a paid account on Appetize.io, visit it and enter the code in the field to open the project. If you don’t have an account, click on “Open project” and wait in an account-level queue to open the project.

However, I recommend that you download the Expo app and open this project on your mobile phone to check out all of the features of the video and audio recording app.

You can find the full code for the media recording app in the repository on GitHub.

Dependencies Used For App Development

As mentioned, the media recording app is developed with React Native and Expo.

You can see the full list of dependencies in the repository’s package.json file.

These are the main libraries used:

  • React-navigation, for navigating the application,
  • Redux, for saving the application’s state,
  • React-redux, which are React bindings for Redux,
  • Recompose, for writing the components’ logic,
  • Reselect, for extracting the state fragments from Redux.

Let’s look at the project’s structure:



  • src/index.js: root app component imported in the app.js file;
  • src/components: reusable components;
  • src/constants: global constants;
  • src/styles: global styles, colors, font sizes and dimensions;
  • src/utils: useful utilities and recompose enhancers;
  • src/screens: screens components;
  • src/store: Redux store;
  • src/navigation: application’s navigator;
  • src/modules: Redux modules divided by entities as modules/audio, modules/video, modules/navigation.

Let’s proceed to the practical part.

Create Audio Recording Functionality With React Native

First, it’s important to check the documentation for the Expo Audio API, related to audio recording and playback. You can see all of the code in the repository. I recommend opening the code as you read this article to better understand the process.

When launching the application for the first time, you’ll need the user’s permission for audio recording, which entails access to the microphone. Let’s use Expo.AppLoading and ask permission for recording by using Expo.Permissions (see src/index.js) during startAsync.

await Permissions.askAsync(Permissions.AUDIO_RECORDING);

Audio recordings are displayed on a separate screen whose UI changes depending on the state.

First, you can see the button “Start recording”. After it is clicked, the audio recording begins, and you will find the current audio duration on the screen. After stopping the recording, you will have to type the recording’s name and save the audio to the Redux store.

My audio recording UI looks like this:



I can save the audio in the Redux store in the following format:

audioItemsIds: ['id1', 'id2'],
audioItems: {
  'id1': {
    id: string,
    title: string,
    recordDate: date string,
    duration: number,
    audioUrl: string,
  }
},
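
A minimal sketch of the Redux module that maintains this shape — the action and reducer names here are assumptions on my part; the real module lives in src/modules/audio:

// Hypothetical audio module keeping ids and items in sync.
export const addAudio = audioItem => ({
  type: 'ADD_AUDIO',
  payload: audioItem,
});

export default function audioReducer(
  state = { audioItemsIds: [], audioItems: {} },
  action
) {
  switch (action.type) {
    case 'ADD_AUDIO': {
      const audio = action.payload;
      return {
        audioItemsIds: [...state.audioItemsIds, audio.id],
        audioItems: { ...state.audioItems, [audio.id]: audio },
      };
    }
    default:
      return state;
  }
}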

Let’s write the audio logic by using Recompose in the screen’s container src/screens/RecordAudioScreenContainer.

Before you start recording, customize the audio mode with the help of Expo.Audio.setAudioModeAsync(mode), where mode is a dictionary with the following key-value pairs:

  • playsInSilentModeIOS: A boolean selecting whether your experience’s audio should play in silent mode on iOS. This value defaults to false.
  • allowsRecordingIOS: A boolean selecting whether recording is enabled on iOS. This value defaults to false. Note: When this flag is set to true, playback may be routed to the phone receiver, instead of to the speaker.
  • interruptionModeIOS: An enum selecting how your experience’s audio should interact with the audio from other apps on iOS.
  • shouldDuckAndroid: A boolean selecting whether your experience’s audio should automatically be lowered in volume (“duck”) if audio from another app interrupts your experience. This value defaults to true. If false, audio from other apps will pause your audio.
  • interruptionModeAndroid: An enum selecting how your experience’s audio should interact with the audio from other apps on Android.

Note: You can learn more about the customization of AudioMode in the documentation.

I have used the following values in this app:

  • interruptionModeIOS: Audio.INTERRUPTION_MODE_IOS_DO_NOT_MIX — our recording interrupts audio from other apps on iOS;
  • playsInSilentModeIOS: true;
  • shouldDuckAndroid: true;
  • interruptionModeAndroid: Audio.INTERRUPTION_MODE_ANDROID_DO_NOT_MIX — our recording interrupts audio from other apps on Android.

allowsRecordingIOS will change to true before the audio recording starts and back to false after it completes.

To implement this, let’s write the handler setAudioMode with Recompose.

withHandlers({
 setAudioMode: () => async ({ allowsRecordingIOS }) => {
   try {
     await Audio.setAudioModeAsync({
       allowsRecordingIOS,
       interruptionModeIOS: Audio.INTERRUPTION_MODE_IOS_DO_NOT_MIX,
       playsInSilentModeIOS: true,
       shouldDuckAndroid: true,
       interruptionModeAndroid: Audio.INTERRUPTION_MODE_ANDROID_DO_NOT_MIX,
     });
   } catch (error) {
     console.log(error) // eslint-disable-line
   }
 },
}),

To record the audio, you’ll need to create an instance of the Expo.Audio.Recording class.

const recording = new Audio.Recording();

After creating the recording instance, you will be able to receive the status of the Recording with the help of recordingInstance.getStatusAsync().

The status of the recording is a dictionary with the following key-value pairs:

  • canRecord: a boolean.
  • isRecording: a boolean describing whether the recording is currently recording.
  • isDoneRecording: a boolean.
  • durationMillis: current duration of the recorded audio.

You can also set a function to be called at regular intervals with recordingInstance.setOnRecordingStatusUpdate(onRecordingStatusUpdate).

To update the UI, you will need to call setOnRecordingStatusUpdate and set your own callback.

Let’s add some props and a recording callback to the container.

withStateHandlers({
    recording: null,
    isRecording: false,
    durationMillis: 0,
    isDoneRecording: false,
    fileUrl: null,
    audioName: '',
  }, {
    setState: () => obj => obj,
    setAudioName: () => audioName => ({ audioName }),
    recordingCallback: () => ({ durationMillis, isRecording, isDoneRecording }) =>
      ({ durationMillis, isRecording, isDoneRecording }),
  }),

The callback setting for setOnRecordingStatusUpdate is:

recording.setOnRecordingStatusUpdate(props.recordingCallback);

onRecordingStatusUpdate is called every 500 milliseconds by default. For smoother UI updates, set a 200-millisecond interval with the help of setProgressUpdateInterval:

recording.setProgressUpdateInterval(200);

After creating an instance of this class, call prepareToRecordAsync to record the audio.

recordingInstance.prepareToRecordAsync(options) loads the recorder into memory and prepares it for recording. It must be called before calling startAsync(). This method can be used if the recording instance has never been prepared.

The parameters of this method include such options for the recording as sample rate, bitrate, channels, format, encoder and extension. You can find a list of all recording options in this document.

In this case, let’s use Audio.RECORDING_OPTIONS_PRESET_HIGH_QUALITY.

After the recording has been prepared, you can start recording by calling the method recordingInstance.startAsync().

Before creating a new recording instance, check whether it has been created before. The handler for beginning the recording looks like this:

onStartRecording: props => async () => {
      try {
        if (props.recording) {
          props.recording.setOnRecordingStatusUpdate(null);
          props.setState({ recording: null });
        }

        await props.setAudioMode({ allowsRecordingIOS: true });

        const recording = new Audio.Recording();
        recording.setOnRecordingStatusUpdate(props.recordingCallback);
        recording.setProgressUpdateInterval(200);

        props.setState({ fileUrl: null });

        await recording.prepareToRecordAsync(Audio.RECORDING_OPTIONS_PRESET_HIGH_QUALITY);
        await recording.startAsync();

        props.setState({ recording });
      } catch (error) {
        console.log(error) // eslint-disable-line
      }
    },

Now you need to write a handler for the audio recording completion. After clicking the stop button, you have to stop the recording, disable it on iOS, receive and save the local URL of the recording, and set OnRecordingStatusUpdate and the recording instance to null:

onEndRecording: props => async () => {
      try {
        await props.recording.stopAndUnloadAsync();
        await props.setAudioMode({ allowsRecordingIOS: false });
      } catch (error) {
        console.log(error); // eslint-disable-line
      }

      if (props.recording) {
        const fileUrl = props.recording.getURI();
        props.recording.setOnRecordingStatusUpdate(null);
        props.setState({ recording: null, fileUrl });
      }
    },

After this, type the audio name, click the “continue” button, and the audio note will be saved in the Redux store.

onSubmit: props => () => {
      if (props.audioName && props.fileUrl) {
        const audioItem = {
          id: uuid(),
          recordDate: moment().format(),
          title: props.audioName,
          audioUrl: props.fileUrl,
          duration: props.durationMillis,
        };

        props.addAudio(audioItem);
        props.setState({
          audioName: '',
          isDoneRecording: false,
        });

        props.navigation.navigate(screens.LibraryTab);
      }
    },

Audio Playback With React Native

You can play the audio on the screen with the saved audio notes. To start the audio playback, click one of the items on the list. Below, you can see the audio player that allows you to track the current position of playback, to set the playback starting point and to toggle the playing audio.

Here’s what my audio playback UI looks like:



The Expo.Audio.Sound objects and Expo.Video components share a unified imperative API for media playback.

Let’s write the logic of the audio playback by using Recompose in the screen container src/screens/LibraryScreen/LibraryScreenContainer, as the audio player is available only on this screen.

If you want to display the player at any point of the application, I recommend writing the logic of the player and audio playback in Redux operations using redux-thunk.
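
To illustrate that idea, a redux-thunk operation driving a global player could look roughly like this — the action types and state shape here are hypothetical; the app itself keeps this logic in the screen container:

// Sketch only: a thunk that loads a sound and reports status updates to the store.
export const playAudio = audioItem => async (dispatch) => {
  dispatch({ type: 'PLAYER_LOADING', payload: audioItem });

  const { sound } = await Audio.Sound.create(
    { uri: audioItem.audioUrl },
    { shouldPlay: true, progressUpdateIntervalMillis: 50 },
    status => dispatch({ type: 'PLAYER_STATUS_UPDATE', payload: status }),
  );

  dispatch({ type: 'PLAYER_READY', payload: sound });
};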

Let’s customize the audio mode in the same way we did for the audio recording. First, set allowsRecordingIOS to false.

lifecycle({
    async componentDidMount() {
      await Audio.setAudioModeAsync({
        allowsRecordingIOS: false,
        interruptionModeIOS: Audio.INTERRUPTION_MODE_IOS_DO_NOT_MIX,
        playsInSilentModeIOS: true,
        shouldDuckAndroid: true,
        interruptionModeAndroid: Audio.INTERRUPTION_MODE_ANDROID_DO_NOT_MIX,
      });
    },
  }),

We have created the recording instance for audio recording. As for audio playback, we need to create the sound instance. We can do it in two different ways:

  1. const playbackObject = new Expo.Audio.Sound();
  2. Expo.Audio.Sound.create(source, initialStatus = {}, onPlaybackStatusUpdate = null, downloadFirst = true)

If you use the first method, you will need to call playbackObject.loadAsync(), which loads the media from source into memory and prepares it for playing, after creation of the instance.

The second method is a static convenience method to construct and load a sound. It creates and loads a sound from source with the optional initialStatus, onPlaybackStatusUpdate and downloadFirst parameters.

The source parameter is the source of the sound. It supports the following forms:

  • a dictionary of the form { uri: 'http://path/to/file' } with a network URL pointing to an audio file on the web;
  • require('path/to/file') for an audio file asset in the source code directory;
  • an Expo.Asset object for an audio file asset.

The initialStatus parameter is the initial playback status. PlaybackStatus is the structure returned from all playback API calls describing the state of the playbackObject at that point of time. It is a dictionary with the key-value pairs. You can check all of the keys of the PlaybackStatus in the documentation.

onPlaybackStatusUpdate is a function taking a single parameter, PlaybackStatus. It is called at regular intervals while the media is in the loaded state. The interval is 500 milliseconds by default. In my application, I set it to a 50-millisecond interval for a proper UI update.

Before creating the sound instance, you will need to implement the onPlaybackStatusUpdate callback. First, add some props to the screen container:

withClassVariableHandlers({
    playbackInstance: null,
    isSeeking: false,
    shouldPlayAtEndOfSeek: false,
    playingAudio: null,
  }, 'setClassVariable'),
  withStateHandlers({
    position: null,
    duration: null,
    shouldPlay: false,
    isLoading: true,
    isPlaying: false,
    isBuffering: false,
    showPlayer: false,
  }, {
    setState: () => obj => obj,
  }),

Now, implement onPlaybackStatusUpdate. You will need to make several validations based on PlaybackStatus for a proper UI display:

withHandlers({
    soundCallback: props => (status) => {
      if (status.didJustFinish) {
        props.playbackInstance().stopAsync();
      } else if (status.isLoaded) {
        const position = props.isSeeking()
          ? props.position
          : status.positionMillis;
        const isPlaying = props.isSeeking()
          ? props.isPlaying
          : status.isPlaying;
        // Mirror the playback status into component state for the player UI
        // (the exact fields below are reconstructed from the state declared above).
        props.setState({
          position,
          duration: status.durationMillis,
          shouldPlay: status.shouldPlay,
          isPlaying,
          isBuffering: status.isBuffering,
        });
      }
    },
  }),

After this, you have to implement a handler for the audio playback. If a sound instance is already created, you need to unload the media from memory by calling playbackInstance.unloadAsync() and clear OnPlaybackStatusUpdate:

loadPlaybackInstance: props => async (shouldPlay) => {
      props.setState({ isLoading: true });

      if (props.playbackInstance() !== null) {
        await props.playbackInstance().unloadAsync();
        props.playbackInstance().setOnPlaybackStatusUpdate(null);
        props.setClassVariable({ playbackInstance: null });
      }
      const { sound } = await Audio.Sound.create(
        { uri: props.playingAudio().audioUrl },
        { shouldPlay, position: 0, duration: 1, progressUpdateIntervalMillis: 50 },
        props.soundCallback,
      );

      props.setClassVariable({ playbackInstance: sound });

      props.setState({ isLoading: false });
    },

Call the handler loadPlaybackInstance(true) by clicking the item in the list. It will automatically load and play the audio.
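
That press handler itself is not shown in this excerpt; a sketch of what it could look like, assuming the playingAudio class variable and showPlayer state introduced above (the handler name is hypothetical):

onPressListItem: props => async (audioItem) => {
      // Remember which audio was tapped and reveal the player UI,
      // then load the sound and start playback immediately.
      props.setClassVariable({ playingAudio: audioItem });
      props.setState({ showPlayer: true });
      await props.loadPlaybackInstance(true);
    },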

Let’s add the pause and play functionality (toggle playing) to the audio player. If audio is already playing, you can pause it with the help of playbackInstance.pauseAsync(). If audio is paused, you can resume playback from the paused point with the help of the playbackInstance.playAsync() method:

onTogglePlaying: props => () => {
      if (props.playbackInstance() !== null) {
        if (props.isPlaying) {
          props.playbackInstance().pauseAsync();
        } else {
          props.playbackInstance().playAsync();
        }
      }
    },

When you click on the playing item, it should stop. If you want to stop audio playback and put it into the 0 playing position, you can use the method playbackInstance.stopAsync():

onStop: props => () => {
      if (props.playbackInstance() !== null) {
        props.playbackInstance().stopAsync();

        props.setShowPlayer(false);
        props.setClassVariable({ playingAudio: null });
      }
    },

The audio player also allows you to rewind the audio with the help of the slider. When you start sliding, the audio playback should be paused with playbackInstance.pauseAsync().

After the sliding is complete, you can set the audio playing position with the help of playbackInstance.setPositionAsync(value), or play back the audio from the set position with playbackInstance.playFromPositionAsync(value):

onCompleteSliding: props => async (value) => {
      if (props.playbackInstance() !== null) {
        if (props.shouldPlayAtEndOfSeek) {
          await props.playbackInstance().playFromPositionAsync(value);
        } else {
          await props.playbackInstance().setPositionAsync(value);
        }
        props.setClassVariable({ isSeeking: false });
      }
    },
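
The matching slide-start handler is omitted above; a sketch of it, assuming the isSeeking and shouldPlayAtEndOfSeek class variables from earlier (the handler name is an assumption):

onStartSliding: props => () => {
      if (props.playbackInstance() !== null) {
        // Remember whether to resume playback once seeking ends, then pause.
        props.setClassVariable({ isSeeking: true, shouldPlayAtEndOfSeek: props.shouldPlay });
        props.playbackInstance().pauseAsync();
      }
    },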

After this, you can pass the props to the components MediaList and AudioPlayer (see the file src/screens/LibraryScreen/LibraryScreenView).

Video Recording Functionality With React Native

Let’s proceed to video recording.

We’ll use Expo.Camera for this purpose. Expo.Camera is a React component that renders a preview of the device’s front or back camera. Expo.Camera can also take photos and record videos that are saved to the app’s cache.

To record video, you need permission for access to the camera and microphone. Let’s add the request for camera access as we did with the audio recording (in the file src/index.js):

await Permissions.askAsync(Permissions.CAMERA);

Video recording is available on the “Video Recording” screen. After switching to this screen, the camera will turn on.

You can change the camera type (front or back) and start video recording. During recording, you can see its general duration and can cancel or stop it. When recording is finished, you will have to type the name of the video, after which it will be saved in the Redux store.

Here is what my video recording UI looks like:



Let’s write the video recording logic by using Recompose in the container screen src/screens/RecordVideoScreen/RecordVideoScreenContainer.

You can see the full list of props for the Expo.Camera component in the documentation.

In this application, we will use the following props for Expo.Camera.

  • type: The camera type is set (front or back).
  • onCameraReady: This callback is invoked when the camera preview is set. You won’t be able to start recording if the camera is not ready.
  • style: This sets the styles for the camera container. In this case, the size is 4:3.
  • ref: This is used for direct access to the camera component.

Let’s add the variable for saving the type and handler for its changing.

cameraType: Camera.Constants.Type.back,
toggleCameraType: state => () => ({
      cameraType: state.cameraType === Camera.Constants.Type.front
        ? Camera.Constants.Type.back
        : Camera.Constants.Type.front,
    }),

Let’s add the variable for saving the camera ready state and callback for onCameraReady.

isCameraReady: false,

setCameraReady: () => () => ({ isCameraReady: true }),

Let’s add the variable for saving the camera component reference and setter.

cameraRef: null,

setCameraRef: () => cameraRef => ({ cameraRef }),

Let’s pass these variables and handlers to the camera component.

<Camera
          type={cameraType}
          onCameraReady={setCameraReady}
          style={s.camera}
          ref={setCameraRef}
        />

Now, when calling toggleCameraType after clicking the button, the camera will switch from the front to the back.

Currently, we have access to the camera component via the reference, and we can start video recording with the help of cameraRef.recordAsync().

The method recordAsync starts recording a video to be saved to the cache directory.

Arguments:

Options (object) — a map of options:

  • quality (VideoQuality): Specify the quality of the recorded video. Usage: Camera.Constants.VideoQuality['<value>'], possible values: for 16:9 resolution, 2160p, 1080p, 720p and 480p (Android only); for 4:3, the size is 640×480. If the chosen quality is not available on the device, the highest available one is chosen.
  • maxDuration (number): Maximum video duration in seconds.
  • maxFileSize (number): Maximum video file size in bytes.
  • mute (boolean): If present, video will be recorded with no sound.

recordAsync returns a promise that resolves to an object containing the video file’s URI property. You will need to save the file’s URI in order to play back the video later. The promise resolves when stopRecording is invoked, when maxDuration or maxFileSize is reached, or when the camera preview is stopped.

Because the camera component’s aspect ratio is set to 4:3, let’s set the same format for the video quality.

Here is what the handler for starting video recording looks like (see the full code of the container in the repository):

onStartRecording: props => async () => {
      if (props.isCameraReady) {
        props.setState({ isRecording: true, fileUrl: null });
        props.setVideoDuration();
        props.cameraRef.recordAsync({ quality: '4:3' })
          .then((file) => {
            props.setState({ fileUrl: file.uri });
          });
      }
    },

During the video recording, we can’t receive the recording status as we have done for audio. That’s why I have created a function to set video duration.
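
That duration function is not shown here; a simple version could tick once per second with setInterval — an assumption on my part, the original lives in the repository:

setVideoDuration: props => () => {
      let seconds = 0;
      // Tick once per second and mirror the elapsed time into component state;
      // stopRecording below clears this interval via clearInterval(props.interval).
      const interval = setInterval(() => {
        seconds += 1;
        props.setState({ videoDuration: seconds });
      }, 1000);
      props.setState({ interval });
    },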

To stop the video recording, we have to call the following function:

stopRecording: props => () => {
      if (props.isRecording) {
        props.cameraRef.stopRecording();
        props.setState({ isRecording: false });
        clearInterval(props.interval);
      }
    },

Check out the entire process of video recording.

Video Playback Functionality With React Native

You can play back the video on the “Library” screen. Video notes are located in the “Video” tab.

To start the video playback, click the selected item in the list. Then, switch to the playback screen, where you can watch or delete the video.

The UI for video playback looks like this:



To play back the video, use Expo.Video, a component that displays a video inline with the other React Native UI elements in your app.

The video will be displayed on the separate screen, PlayVideo.

You can check out all of the props for Expo.Video here.

In our application, the Expo.Video component uses native playback controls and looks like this:

<Video
        source={{ uri: videoUrl }}
        style={s.video}
        shouldPlay={isPlaying}
        resizeMode="contain"
        useNativeControls={isPlaying}
        onLoad={onLoad}
        onError={onError}
      />
  • source

    This is the source of the video data to display. The same forms as for Expo.Audio.Sound are supported.

  • resizeMode

    This is a string describing how the video should be scaled for display in the component view’s bounds. It can be “stretch”, “contain” or “cover”.

  • shouldPlay

    This boolean describes whether the media is supposed to play.

  • useNativeControls

    This boolean, if set to true, displays native playback controls (such as play and pause) within the video component.

  • onLoad

    This function is called once the video has been loaded.

  • onError

    This function is called if loading or playback has encountered a fatal error. The function passes a single error message string as a parameter.

When the video is loaded, the play button should be rendered on top of it.

When you click the play button, the video turns on and the native playback controls are displayed.

Let’s write the logic of the video using Recompose in the screen container src/screens/PlayVideoScreen/PlayVideoScreenContainer:

const defaultState = {
  isError: false,
  isLoading: false,
  isPlaying: false,
};

const enhance = compose(
  paramsToProps('videoUrl'),
  withStateHandlers({
    ...defaultState,
    isLoading: true,
  }, {
    onError: () => () => ({ ...defaultState, isError: true }),
    onLoad: () => () => defaultState,
    onTogglePlaying: ({ isPlaying }) => () => ({ ...defaultState, isPlaying: !isPlaying }),
  }),
);

As previously mentioned, the Expo.Audio.Sound objects and Expo.Video components share a unified imperative API for media playback. That’s why you can create custom controls and use more advanced functionality with the Playback API.

Check out the video playback process:

See the full code for the application in the repository.

You can also install the app on your phone by using Expo and check out how it works in practice.

Wrapping Up

I hope you have enjoyed this article and have enriched your knowledge of React Native. You can use this audio and video recording tutorial to create your own custom-designed media player. You can also scale the functionality and add the ability to save media in the phone’s memory or on a server, synchronize media data between different devices, and share media with others.

As you can see, there is a wide scope for imagination. If you have any questions about the process of developing an audio or video recording app with React Native, feel free to drop a comment below.


The Rise Of The State Machines

It’s 2018 already, and countless front-end developers are still leading a battle against complexity and immobility. Month after month, they’ve searched for the holy grail: a bug-free application architecture that will help them deliver quickly and with high quality. I am one of those developers, and I’ve found something interesting that might help.
We have taken a good step forward with tools such as React and Redux. However, they’re not enough on their own in large-scale applications.


Web Development Reading List #187: Webpack 3, Assisted Writing, And Automated Chrome Testing

This week, we’ll explore some rather new concepts: What happens if we apply artificial intelligence to text software, for example? And why would a phone manufacturer want its business model to be stolen by competitors? We’ll also take a look at how we can use the new headless Chrome browser for automated testing and learn to build smarter JavaScript bundles with Webpack 3’s new scope hoisting. Sometimes it’s easy to be excited about all the improvements and new things our industry has to offer.


How To Create Native Cross-Platform Apps With Fuse

Fuse is a toolkit for creating apps that run on both iOS and Android devices. It enables you to create apps using UX Markup, an XML-based language. But unlike the components in React Native and NativeScript, Fuse is not only used to describe the UI and layout; you can also use it to add effects and animation.


Styles are described by adding attributes such as Color and Margin to the various elements. Business logic is written using JavaScript. Later on, we’ll see how all of these components are combined to build a truly native app.



Internationalizing React Apps

First of all, let’s define some vocabulary. “Internationalization” is a long word, and there are at least three widely used abbreviations: “intl,” “i18n” and “l10n.” All of them mean the same thing.


Internationalization can be generally broken down into three main challenges: Detecting the user’s locale, translating UI elements, titles as well as hints, and last but not least, serving locale-specific content such as dates, currencies and numbers. In this article, I am going to focus only on front-end part. We’ll develop a simple universal React application with full internationalization support.



Styled-Components: Enforcing Best Practices In Component-Based Systems

Building user interfaces on the web is hard, because the web and, thus, CSS were inherently made for documents. Some smart developers invented methodologies and conventions such as BEM, ITCSS, SMACSS and many more, which make building user interfaces easier and more maintainable by working with components.


After this shift in mindset towards building component-based user interfaces, we are now in what we like to call the “component age.” The rise of JavaScript frameworks such as React, Ember and recently Angular 2, the effort of the W3C to standardize a web-native component system, pattern libraries and style guides being considered the “right way” to build web applications, and many other things have illuminated this revolution.



Web Development Reading List #165: Starting The New Year With Browser News, Container Architecture, And React “Aha” Moments

Happy new year! I hope you had a good start and can feel positive about what 2017 might bring. As mentioned in the last edition of the past year, I don’t like New Year’s resolutions too much, but I’d like to point you to something that Marc Thiele wishes for this year:
“So my wish then also is, that you reflect and ask yourself, if you want to post the text or maybe even just have another, a second look on the text you are about to post.



Building Hybrid Apps With ChakraCore

There are many reasons why one may want to embed JavaScript capabilities into an app. One example may be to take a dependency on a JavaScript library that has not yet been ported to the language you’re developing in. Another may be that you want to allow users to “eval” small routines or functions in JavaScript, e.g., in data processing applications.


The key reason for our investigation of ChakraCore was to support the React Native framework on the Universal Windows Platform, which is a framework for declaring applications using JavaScript and the React programming model.




How To Scale React Applications

We recently released version 3 of React Boilerplate, one of the most popular React starter kits, after several months of work. The team spoke with hundreds of developers about how they build and scale their web applications, and I want to share some things we learned along the way.


We realized early on in the process that we didn’t want it to be “just another boilerplate.” We wanted to give developers who were starting a company or building a product the best foundation to start from and to scale.

