
How-To Tutorials - Android Programming

Drawing and Drawables in Android Canvas

Packt
21 Nov 2013
8 min read
In this article by Mir Nauman Tahir, the author of the book Learning Android Canvas, our goal is to learn about the following: drawing on a Canvas, drawing on a View, drawing on a SurfaceView, Drawables, Drawables from resource images, Drawables from resource XML, and shape Drawables. (For more resources related to this topic, see here.)

Android provides us with 2D drawing APIs that enable us to draw our custom graphics on the Canvas. When working with 2D drawings, we will either draw on a View or directly on the surface or Canvas. Using a View for our graphics, the drawing is handled by the system's normal View hierarchy drawing process; we only define the graphics to be inserted in the View and the rest is done automatically by the system. When drawing directly on the Canvas, we have to manually call the suitable drawing methods such as onDraw() or createBitmap(). This method requires more effort and coding and is a bit more complicated, but everything stays under our control: the animation, the size and location of the drawing, the colors, and the ability to move the drawing from its current location to another location through code. The implementation of the onDraw() method can be seen in the Drawing on a View section, and the code for createBitmap() is shown in the Drawing on a Canvas section.

We will use the drawing-on-a-View method if we are dealing with static graphics (static graphics do not change dynamically during the execution of the application) or with graphics that are not resource hungry, as we don't wish to put our application's performance at stake. Drawing on a View can be used for designing eye-catching, simple applications with static graphics and simple functionality, such as attractive backgrounds and buttons. It's perfectly okay to draw on a View using the main UI thread, as these graphics are not a threat to the overall performance of our application.

The drawing-on-a-Canvas method should be used when working with heavy graphics that change dynamically, like those in games. In this scenario, the Canvas will continuously redraw itself to keep the graphics updated. We can draw on a Canvas using the main UI thread, but when working with heavy, resource-hungry, dynamically changing graphics, the application will continuously redraw itself, and it is better to use a separate thread to draw these graphics. Keeping such graphics on the main UI thread will make the application go into the not-responding mode, and after working so hard we certainly won't like this. So this choice should be made very carefully.

Drawing on a Canvas

A Canvas is an interface, a medium that enables us to actually access the surface which we will use to draw our graphics. The Canvas contains all the necessary drawing methods needed to draw our graphics. The actual internal mechanism of drawing on a Canvas is that, whenever anything needs to be drawn on the Canvas, it's actually drawn on an underlying blank bitmap image. By default, this bitmap is automatically provided for us. But if we want to use a new Canvas, then we need to create a new bitmap image and then a new Canvas object, providing the already created bitmap to the constructor of the Canvas class. A sample code is explained as follows. Initially, the bitmap is drawn but not on the screen; it's actually drawn in the background on an internal Canvas.
But to bring it to the front, we need to create a new Canvas object and provide the already created bitmap to it to be painted on the screen:

Bitmap ourNewBitmap = Bitmap.createBitmap(100, 100, Bitmap.Config.ARGB_8888);
Canvas ourNewCanvas = new Canvas(ourNewBitmap);

Drawing on a View

If our application does not require heavy system resources or fast frame rates, we should use View.onDraw(). The benefit in this case is that the system will automatically give the Canvas its underlying bitmap as well. All we need is to make our drawing calls and be done with our drawings. We will create our class by extending it from the View class and will define the onDraw() method in it. The onDraw() method is where we will define whatever we want to draw on our Canvas. The Android framework will call the onDraw() method to ask our View to draw itself; it is called on a need basis, for example, whenever our application wants to draw itself. We have to call the invalidate() method whenever we want our view to redraw itself. This means that, whenever we want our application's view to be redrawn, we will call the invalidate() method and the Android framework will call the onDraw() method for us. Let's say we want to draw a line; the code would be something like this:

class DrawView extends View {
    Paint paint = new Paint();

    public DrawView(Context context) {
        super(context);
        paint.setColor(Color.BLUE);
    }

    @Override
    public void onDraw(Canvas canvas) {
        super.onDraw(canvas);
        canvas.drawLine(10, 10, 90, 10, paint);
    }
}

Inside the onDraw() method, we can use all the facilities provided by the Canvas class, such as its different drawing methods, and we can also use drawing methods from other classes. The Android framework will draw a bitmap on the Canvas for us once our onDraw() method has completed all of our desired functionality. If we are using the main UI thread, we will call the invalidate() method, but if we are using another thread, then we will call the postInvalidate() method.

Drawing on a SurfaceView

The View class provides a subclass, SurfaceView, which provides a dedicated drawing surface within the hierarchy of the View. The goal is to draw using a secondary thread so that the application won't wait for the resources to be free and ready to redraw. The secondary thread has access to the SurfaceView object, which has the ability to draw on its own Canvas with its own redraw frequency. We will start by creating a class that extends the SurfaceView class. We should implement the interface SurfaceHolder.Callback. This interface is important in the sense that it will provide us with the information about when a surface is created, modified, or destroyed. When we have timely information about the creation, change, or destruction of a surface, we can make a better decision on when to start drawing and when to stop. The secondary thread class that will perform all the drawing on our Canvas can also be defined in the SurfaceView class. To get information, the Surface object should be handled through SurfaceHolder and not directly. To do this, we will get the holder by calling the getHolder() method when the SurfaceView is initialized. We will then tell the SurfaceHolder object that we want to receive all the callbacks; to do this, we will call addCallback(). After this, we will override all the methods inside the SurfaceView class to get our job done according to our functionality.
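Before continuing with the SurfaceView, the following is a minimal sketch of the invalidate()/postInvalidate() distinction mentioned in the View section above. It assumes the DrawView class defined earlier; the activity class name is illustrative and not part of the original text.

import android.app.Activity;
import android.os.Bundle;

public class DrawingActivity extends Activity {
    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        final DrawView drawView = new DrawView(this);
        setContentView(drawView);

        // On the main UI thread a redraw is requested with invalidate().
        drawView.invalidate();

        // From any other thread, postInvalidate() must be used instead;
        // it posts the redraw request back to the UI thread.
        new Thread(new Runnable() {
            @Override
            public void run() {
                // ... update whatever state onDraw() reads ...
                drawView.postInvalidate();
            }
        }).start();
    }
}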
The next step is to draw on the surface's Canvas from inside the second thread; to do this, we will pass our SurfaceHolder object to the thread object and will get the Canvas using the lockCanvas() method. This will get the Canvas for us and will lock it for drawing from the current thread only. We need to do this because we don't want an open Canvas that can be drawn on by another thread; if that were the situation, it would disturb all our graphics and drawings on the Canvas. When we are done with drawing our graphics on the Canvas, we will unlock the Canvas by calling the unlockCanvasAndPost() method, passing our Canvas object. To have a successful drawing, we will need repeated redraws, so we will repeat this locking and unlocking as needed and the surface will draw the Canvas.

To have a uniform and smooth graphic animation, we need to have the previous state of the Canvas, so we will retrieve the Canvas from the SurfaceHolder object every time, and the whole surface should be redrawn each time. If we don't do so, for instance by not painting the whole surface, the drawing from the previous Canvas will persist and that will destroy the whole look of our graphics-intense application. A sample code would be the following:

class OurGameView extends SurfaceView implements SurfaceHolder.Callback, Runnable {
    Thread thread = null;
    SurfaceHolder surfaceHolder;
    volatile boolean running = false;

    public OurGameView(Context context) {
        super(context);
        surfaceHolder = getHolder();
    }

    public void onResumeOurGameView() {
        running = true;
        thread = new Thread(this);
        thread.start();
    }

    public void onPauseOurGameView() {
        boolean retry = true;
        running = false;
        while (retry) {
            try {
                thread.join();
                retry = false;
            } catch (InterruptedException e) {
                // keep retrying until the drawing thread has finished
            }
        }
    }

    public void run() {
        while (running) {
            if (surfaceHolder.getSurface().isValid()) {
                Canvas canvas = surfaceHolder.lockCanvas();
                // ... actual drawing on canvas
                surfaceHolder.unlockCanvasAndPost(canvas);
            }
        }
    }

    public void surfaceCreated(SurfaceHolder holder) { }
    public void surfaceChanged(SurfaceHolder holder, int format, int width, int height) { }
    public void surfaceDestroyed(SurfaceHolder holder) { }
}
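To tie the pieces together, here is a minimal sketch, not taken from the original text, of how an Activity might host OurGameView and forward its lifecycle events to the onResumeOurGameView() and onPauseOurGameView() methods defined above; the activity class name is illustrative.

import android.app.Activity;
import android.os.Bundle;

public class GameActivity extends Activity {
    private OurGameView gameView;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        gameView = new OurGameView(this);
        setContentView(gameView);
    }

    @Override
    protected void onResume() {
        super.onResume();
        gameView.onResumeOurGameView();   // start the drawing thread
    }

    @Override
    protected void onPause() {
        gameView.onPauseOurGameView();    // stop the drawing thread and wait for it
        super.onPause();
    }
}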

Creating User Interfaces

Packt
15 Oct 2013
6 min read
When creating an Android application, we have to be aware of the existence of multiple screen sizes and screen resolutions. It is important to check how our layouts are displayed in different screen configurations. To accomplish this, Android Studio provides functionality to change the layout preview when we are in the design mode. We can find this functionality in the toolbar; by default, the device definition used in the preview is the Nexus 4. Click on it to open the list of available device definitions and try some of them. The difference between a tablet device and a device like the Nexus One is very noticeable. We should adapt the views to all the screen configurations our application supports to ensure they are displayed optimally.

The device definitions indicate the screen size in inches, the resolution, and the screen density. Android divides screen densities into ldpi, mdpi, hdpi, xhdpi, and even xxhdpi:

ldpi (low-density dots per inch): About 120 dpi
mdpi (medium-density dots per inch): About 160 dpi
hdpi (high-density dots per inch): About 240 dpi
xhdpi (extra-high-density dots per inch): About 320 dpi
xxhdpi (extra-extra-high-density dots per inch): About 480 dpi

The last dashboards published by Google show that most devices have high-density screens (34.3 percent), followed by xhdpi (23.7 percent) and mdpi (23.5 percent). Therefore, we can cover 81.5 percent of the devices by testing our application using these three screen densities. Official Android dashboards are available at http://developer.android.com/about/dashboards.

Another issue to keep in mind is the device orientation. Do we want to support the landscape mode in our application? If the answer is yes, we have to test our layouts in the landscape orientation. On the toolbar, click on the layout state option to change the mode from portrait to landscape or from landscape to portrait. If our application supports the landscape mode and the layout does not display as expected in this orientation, we may want to create a variation of the layout. Click on the first icon of the toolbar, that is, the configuration option, and select the option Create Landscape Variation. A new layout will be opened in the editor. This layout has been created in the resources folder, under the directory layout-land, using the same name as the portrait layout: /src/main/res/layout-land/activity_main.xml. Now we can edit the new layout variation, perfectly conformed to the landscape mode.

Similarly, we can create a variation of the layout for xlarge screens. Select the option Create layout-xlarge Variation. The new layout will be created in the layout-xlarge folder: /src/main/res/layout-xlarge/activity_main.xml. Android divides actual screen sizes into small, normal, large, and xlarge:

small: Screens classified in this category are at least 426 dp x 320 dp
normal: Screens classified in this category are at least 470 dp x 320 dp
large: Screens classified in this category are at least 640 dp x 480 dp
xlarge: Screens classified in this category are at least 960 dp x 720 dp

A dp is a density-independent pixel, equivalent to one physical pixel on a 160 dpi screen. The last dashboards published by Google show that most devices have a normal screen size (79.6 percent). If you want to cover a bigger percentage of devices, test your application by also using a small screen (9.5 percent), so the coverage will be 89.1 percent of devices.
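To make the dp definition above concrete, here is a small, hedged sketch of converting a dp value into physical pixels at runtime; the helper class name is illustrative and not part of the original article.

import android.content.Context;
import android.util.TypedValue;

public final class DimensionUtils {
    private DimensionUtils() { }

    // Converts a value expressed in dp into raw pixels for the current screen density.
    // On a 160 dpi (mdpi) screen this returns the same number of pixels as the dp value.
    public static float dpToPx(Context context, float dp) {
        return TypedValue.applyDimension(
                TypedValue.COMPLEX_UNIT_DIP,
                dp,
                context.getResources().getDisplayMetrics());
    }
}

For example, dpToPx(context, 320) returns roughly 480 pixels on an hdpi (240 dpi) device and 320 pixels on an mdpi device.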
To display multiple device configurations at the same time, click on the configuration option in the toolbar and select the option Preview All Screen Sizes, or click on Preview Representative Sample to open just the most important screen sizes. We can also delete any of the samples by clicking on it with the right mouse button and selecting the Delete option of the menu. Another useful action of this menu is the Save screenshot option, which allows us to take a screenshot of the layout preview. If we create some layout variations, we can preview all of them by selecting the option Preview Layout Versions.

Changing the UI theme

Layouts and widgets are created using the default UI theme of our project. We can change the appearance of the elements of the UI by creating styles. Styles can be grouped to create a theme, and a theme can be applied to a whole activity or application. Some themes are provided by default, such as the Holo style. Styles and themes are created as resources under the /src/res/values folder.

Open the main layout using the graphical editor. The selected theme for our layout is indicated in the toolbar: AppTheme. This theme was created for our project and can be found in the styles file (/src/res/values/styles.xml). Open the styles file and notice that this theme is an extension of another theme (Theme.Light). To customize our theme, edit the styles file. For example, add the following line to the AppTheme definition to change the window background color:

<style name="AppTheme" parent="AppBaseTheme">
    <item name="android:windowBackground">#dddddd</item>
</style>

Save the file and switch to the layout tab. The background is now light gray. This background color will be applied to all our layouts because we configured it in the theme and not just in the layout. To completely change the layout theme, click on the theme option from the toolbar in the graphical editor. The theme selector dialog opens, displaying a list of the available themes. The themes created in our own project are listed in the Project Themes section. The Manifest Themes section shows the theme configured in the application manifest file (/src/main/AndroidManifest.xml). The All section lists all the available themes.

Summary

In this article, we have seen how to develop and build Android applications with this new IDE. It also showed how to support multiple screens and change their properties using the SDK. The changing of UI themes for the devices was also discussed, along with its properties. The main focus was on the creation of user interfaces using both the graphical view and the text-based view.

Resources for Article: Further resources on this subject: Creating Dynamic UI with Android Fragments, Building Android (Must know), Android Fragmentation Management

Making subtle color shifts with curves

Packt
23 Sep 2013
7 min read
(For more resources related to this topic, see here.) When looking at a scene, we may pick up subtle cues from the way colors shift between different image regions. For example, outdoors on a clear day, shadows have a slightly blue tint due to the ambient light reflected from the blue sky, while highlights have a slightly yellow tint because they are in direct sunlight. When we see bluish shadows and yellowish highlights in a photograph, we may get a "warm and sunny" feeling. This effect may be natural, or it may be exaggerated by a filter. Curve filters are useful for this type of manipulation. A curve filter is parameterized by sets of control points. For example, there might be one set of control points for each color channel. Each control point is a pair of numbers representing the input and output values for the given channel. For example, the pair (128, 180) means that a value of 128 in the given color channel is brightened to become a value of 180. Values between the control points are interpolated along a curve (hence the name, curve filter). In Gimp, a curve with the control points (0, 0), (128, 180), and (255, 255) is visualized as shown in the following screenshot: The x axis shows the input values ranging from 0 to 255, while the y axis shows the output values over the same range. Besides showing the curve, the graph shows the line y = x (no change) for comparison. Curvilinear interpolation helps to ensure that color transitions are smooth, not abrupt. Thus, a curve filter makes it relatively easy to create subtle, natural-looking effects. We may define an RGB curve filter in pseudocode as follows: dst.b = funcB(src.b) where funcB interpolates pointsB dst.g = funcG(src.g) where funcG interpolates pointsG dst.r = funcR(src.r) where funcR interpolates pointsR For now, we will work with RGB and RGBA curve filters, and with channel values that range from 0 to 255. If we want such a curve filter to produce natural-looking results, we should use the following rules of thumb: Every set of control points should include (0, 0) and (255, 255). This way, black remains black, white remains white, and the image does not appear to have an overall tint. As the input value increases, the output value should always increase too. (Their relationship should be monotonically increasing.) This way, shadows remain shadows, highlights remain highlights, and the image does not appear to have inconsistent lighting or contrast. OpenCV does not provide curvilinear interpolation functions but the Apache Commons Math library does. (See Adding files to the project, earlier in this chapter, for instructions on setting up Apache Commons Math.) This library provides interfaces called UnivariateInterpolator and UnivariateFunction, which have implementations including LinearInterpolator, SplineInterpolator, LinearFunction, and PolynomialSplineFunction. (Splines are a type of curve.) UnivariateInterpolator has an instance method, interpolate(double[] xval, double[] yval), which takes arrays of input and output values for the control points and returns a UnivariateFunction object. The UnivariateFunction object can provide interpolated values via the method value(double x). API documentation for Apache Commons Math is available at http://commons.apache.org/proper/commons-math/apidocs/. These interpolation functions are computationally expensive. We do not want to run them again and again for every channel of every pixel and every frame. Fortunately, we do not have to. 
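As a standalone illustration of the interpolation calls described above, the following hedged sketch interpolates the Gimp example's control points; it assumes the commons-math3 package layout, so the imports may need adjusting if the project uses a different version of Apache Commons Math.

import org.apache.commons.math3.analysis.UnivariateFunction;
import org.apache.commons.math3.analysis.interpolation.SplineInterpolator;
import org.apache.commons.math3.analysis.interpolation.UnivariateInterpolator;

public class CurveDemo {
    public static void main(String[] args) {
        // Control points (0, 0), (128, 180), (255, 255) from the curve shown earlier.
        double[] xval = { 0, 128, 255 };
        double[] yval = { 0, 180, 255 };

        UnivariateInterpolator interpolator = new SplineInterpolator();
        UnivariateFunction curve = interpolator.interpolate(xval, yval);

        // Interpolated output values for a few inputs of one color channel.
        System.out.println(curve.value(64));   // somewhere between 0 and 180
        System.out.println(curve.value(128));  // exactly 180, a control point
        System.out.println(curve.value(200));  // somewhere between 180 and 255
    }
}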
There are only 256 possible input values per channel, so it is practical to precompute all possible output values and store them in a lookup table. For OpenCV's purposes, a lookup table is a Mat object whose indices represent input values and whose elements represent output values. The lookup can be performed using the static method Core.LUT(Mat src, Mat lut, Mat dst). In pseudocode, dst = lut[src]. The number of elements in lut should match the range of values in src, and the number of channels in lut should match the number of channels in src. Now, using Apache Commons Math and OpenCV, let's implement a curve filter for RGBA images with channel values ranging from 0 to 255. Open CurveFilter.java and write the following code: public class CurveFilter implements Filter { // The lookup table. private final Mat mLUT = new MatOfInt(); public CurveFilter( final double[] vValIn, final double[] vValOut, final double[] rValIn, final double[] rValOut, final double[] gValIn, final double[] gValOut, final double[] bValIn, final double[] bValOut) { // Create the interpolation functions. UnivariateFunction vFunc = newFunc(vValIn, vValOut); UnivariateFunction rFunc = newFunc(rValIn, rValOut); UnivariateFunction gFunc = newFunc(gValIn, gValOut); UnivariateFunction bFunc = newFunc(bValIn, bValOut); // Create and populate the lookup table. mLUT.create(256, 1, CvType.CV_8UC4); for (int i = 0; i < 256; i++) { final double v = vFunc.value(i); final double r = rFunc.value(v); final double g = gFunc.value(v); final double b = bFunc.value(v); mLUT.put(i, 0, r, g, b, i); // alpha is unchanged } } @Override public void apply(final Mat src, final Mat dst) { // Apply the lookup table. Core.LUT(src, mLUT, dst); } private UnivariateFunction newFunc(final double[] valIn, final double[] valOut) { UnivariateInterpolator interpolator; if (valIn.length > 2) { interpolator = new SplineInterpolator(); } else { interpolator = new LinearInterpolator(); } return interpolator.interpolate(valIn, valOut); } } CurveFilter stores the lookup table in a member variable. The constructor method populates the lookup table based on the four sets of control points that are taken as arguments. As well as a set of control points for each of the RGB channels, the constructor also takes a set of control points for the image's overall brightness, just for convenience. A helper method, newFunc, creates an appropriate interpolation function (linear or spline) for each set of control points. Then, we iterate over the possible input values and populate the lookup table. The apply method is a one-liner. It simply uses the precomputed lookup table with the given source and destination matrices. CurveFilter can be subclassed to define a filter with a specific set of control points. For example, let's open PortraCurveFilter.java and write the following code: public class PortraCurveFilter extends CurveFilter { public PortraCurveFilter() { super( new double[] { 0, 23, 157, 255 }, // vValIn new double[] { 0, 20, 173, 255 }, // vValOut new double[] { 0, 69, 213, 255 }, // rValIn new double[] { 0, 69, 218, 255 }, // rValOut new double[] { 0, 52, 189, 255 }, // gValIn new double[] { 0, 47, 196, 255 }, // gValOut new double[] { 0, 41, 231, 255 }, // bValIn new double[] { 0, 46, 228, 255 }); // bValOut } } This filter brightens the image, makes shadows cooler (more blue), and makes highlights warmer (more yellow). It produces flattering skin tones and tends to make things look sunnier and cleaner. 
It resembles the color characteristics of a brand of photo film called Kodak Portra, which was often used for portraits. The code for our other three channel mixing filters is similar. The ProviaCurveFilter class uses the following arguments for its control points: new double[] { 0, 255 }, // vValIn new double[] { 0, 255 }, // vValOut new double[] { 0, 59, 202, 255 }, // rValIn new double[] { 0, 54, 210, 255 }, // rValOut new double[] { 0, 27, 196, 255 }, // gValIn new double[] { 0, 21, 207, 255 }, // gValOut new double[] { 0, 35, 205, 255 }, // bValIn new double[] { 0, 25, 227, 255 }); // bValOut The effect is a strong, blue or greenish-blue tint in shadows and a strong, yellow or greenish-yellow tint in highlights. It resembles a film processing technique called cross-processing, which was sometimes used to produce grungy-looking photos of fashion models, pop stars, and so on. For a good discussion of how to emulate various brands of photo film, see Petteri Sulonen's blog at http://www.prime-junta.net/pont/How_to/100_Curves_and_Films/_Curves_and_films.html. The control points that we use are based on examples given in this article. Curve filters are a convenient tool for manipulating color and contrast, but they are limited insofar as each destination pixel is affected by only a single input pixel. Next, we will examine a more flexible family of filters, which enable each destination pixel to be affected by a neighborhood of input pixels. Summary In this article we learned how to make subtle color shifts with curves. Resources for Article: Further resources on this subject: Linking OpenCV to an iOS project [Article] A quick start – OpenCV fundamentals [Article] OpenCV: Image Processing using Morphological Filters [Article]

Introducing an Android platform

Packt
12 Sep 2013
9 min read
(For more resources related to this topic, see here.)

Introducing an Android app

A mobile software application that runs on Android is an Android app. Apps use .apk as the installer file extension. There are several popular examples of mobile apps, such as Foursquare, Angry Birds, and Fruit Ninja. Primarily in an Eclipse environment, we use Java, which is then compiled into Dalvik bytecode (not the ordinary Java bytecode). Android provides the Dalvik virtual machine (DVM) inside Android, not the Java virtual machine (JVM). The Dalvik VM does not align with the Java SE and Java ME libraries, and is built on an Apache Harmony Java implementation.

What is the Dalvik virtual machine? The Dalvik VM is a register-based architecture, authored by Dan Bornstein. It is optimized for low memory requirements, and the virtual machine was slimmed down to use less space and less power.

Preparing for Android development: Eclipse ADT

In this part of the article, we will see how to install the development environment for Android on Eclipse Juno (4.2). Eclipse is a major IDE for Android development. We need to install an Eclipse extension, the Android Development Toolkit (ADT), for development of the Android application.

Debugging an Android project

It is advisable to use the Log class for this purpose, the reason being we can filter, print in different colors, and define log types. This could be one of the ways of debugging your program, by displaying the values of variables or parameters. To use Log, import android.util.Log, and use one of the following methods to print messages to LogCat:

v(String, String) (verbose)
d(String, String) (debug)
i(String, String) (information)
w(String, String) (warning)
e(String, String) (error)

LogCat is used to view the internal log of the Android system. It is useful to trace any activity happening inside the device or emulator through the Android Debug Bridge (ADB).

The Android project structure

The following list gives a brief description of the important folders and files available in an Android project:

/src: The Java code is placed in this folder.
/gen: It is generated automatically.
/assets: You can put your fonts, videos, and sounds here. It is more like a filesystem, and can also hold CSS, JavaScript files, and so on.
/libs: It holds external libraries (normally JARs).
/res: It contains images, layouts, and global variables.
/drawable-xhdpi: It is used for extra-high-specification devices (for example, tablets, Galaxy SIII, HTC One X).
/drawable-hdpi: It is used for high-specification phones (for example, SGSI, SGSII).
/drawable-mdpi: It is used for medium-specification phones (for example, Galaxy W and HTC Desire).
/drawable-ldpi: It is used for low-specification phones (for example, Galaxy Y and HTC WildFire).
/layout: It includes all the XML files for the screen(s) layout.
/menu: XML files for the screen menus.
/values: It includes global constants.
/values-v11: These are template style definitions for devices with Honeycomb (Android API level 11).
/values-v14: These are template style definitions for devices with ICS (Android API level 14).
AndroidManifest.xml: This is one of the most important files that define the app. It is the first file located by the Android OS in order to run the app. It contains the app's properties, activity declarations, and list of permissions.

Dalvik Debug Monitor Server (DDMS)

DDMS is a must-have tool to view the emulator/device activities.
To access DDMS in the Eclipse, navigate to Windows | Open Perspective | Other, and then choose DDMS. By default it is available in the Android SDK (it's inside the folder android-sdk/tools by the file ddms). From this perspective the following aspects are available: Devices: The list of the devices and AVD that are connected to ADB Emulator control: It helps to carry out device functions LogCat: It views real-time system log messages Threads: It gives an idea of currently running threads within a VM Heap: It shows heap usage by application Allocation tracker: It provides information on memory allocation of objects File explorer: It explores the device filesystem Creating a new Android project using Eclipse ADT To create a new Android project in Eclipse, navigate to File | New | Project. A new project window will appear, from there choose Android | Android Application Project from the list. Then click on the Next button. Application Name: This is the name of your application, it will appear side-by-side to the launcher icon. Choose a project name that relevant to your application. Project Name: This is typically similar to your application name. Avoid having the same name with existing projects in Eclipse, it is not permitted. Package Name: This is the package name of the application. It will act as an ID in the Google Play app store if we wish to publish. Typically it will be the reverse of your domain name if we have one (since this is unique) followed by the application name, and a valid Java package name else we can have anything now and refactor it before publishing. Running the application on an Android device To run and deploy on real device, first install the driver of the device. This varies as per device model and manufacturer. These are a few links you could refer to: For Google Android devices refer to http://developer.android.com/sdk/win-usb.html. For others refer to http://www.teamandroid.com/download-android-usb-drivers/. Make sure the Android phone is connected to the computer through the USB cable. To check whether the phone is properly connected to your PC and in debug mode, please switch to the DDMS perspective. Adding multiple activity in Android application This exercise is to add an information screen on the SimpleNumb3r5 app. The information regarding the developer, e-mail, Facebook fan page, and other information is displayed. Since the screen contains a lot of text information including several pictures, so we make use of an HTML page as our approach here: Create an activity class to handle the new screen. Open the src folder, right-click on the package name (net.kerul.SimpleNumb3r5), and choose New | Other... from the selections, choose to add a new Android activity, and click on the Next button. Then, choose a blank activity and click on Next. Set the activity name as Info, and the wizard will suggest the screen layout as info_activity. Click on the Finish button. Adding the RadioGroup or RadioButton controls Android SDK provides two types of radio controls to be used in conjunction, where only one control can be chosen at a given time. RadioGroup (Android widget RadioGroup) is used to encapsulate a set of RadioButton controls for this purpose. Defining the preference screen Preferences are an important aspect of the Android applications. It allows users to have the choice to modify and personalize it. 
Preferences can be set in two ways: the first method is to create the preferences.xml file in the res/xml directory, and the second is to set the preferences from code. We will use the former, which is also the easier one, by creating the preferences.xml file. Usually, there are five different preference views, as listed here (a short sketch of reading these values from code appears at the end of this article):

CheckBoxPreference: A simple checkbox which returns true/false
ListPreference: Shows a RadioGroup, where only one item can be selected
EditTextPreference: Shows a dialog box to edit a TextView, and returns a String
RingtonePreference: A RadioGroup that shows ringtones
PreferenceCategory: A category with preferences

Fragment

A fragment is an independent component that can be connected to an Activity or, simply put, is a subactivity. Typically it defines a part of the UI, but it can also exist with no user interface, that is, headless. An instance of a fragment must exist within an activity. Fragments ease the reuse of components for different layouts. Fragments are the way to support UI variance across different types of screens. The most popular use is building single-pane layouts for phones and multi-pane layouts for tablets (large screens).

Adding an external library to an Android project – AdMob

An Android application cannot achieve everything on its own; it will always need the company of external JARs/libraries to achieve different goals and serve various purposes. Almost every free Android application published on the store has advertisements embedded in it, which makes use of an external component to achieve this. Incorporating advertisements in an Android application is a vital aspect of today's application development. In this article, we will continue with our DistanceConverter application and make use of the external library AdMob to incorporate advertisements in our application.

Adding the AdMob SDK to the project

Let's extract the previously downloaded AdMob SDK zip file, and we should get the folder GoogleAdMobAdsSdkAndroid-6.*.*; under that folder there is GoogleAdMobAdsSdk-6.x.x.jar. You should copy this JAR file into the libs folder of the project.

Signing and distributing the APK

The Android package (APK), in simple terms, is similar to a runnable JAR or executable file (on Windows OS) that consists of everything needed to run the application. The Android ecosystem uses a virtual machine, that is, the Dalvik virtual machine (DVM), to run the Java applications. Dalvik uses its own bytecode, which is quite different from the Java bytecode.

Generating a private key

An Android application must be signed with our own private key. It identifies a person, corporation, or entity associated with the application. The key can be generated using the program keytool from the Java SDK. The following command is used for generating the key:

keytool -genkey -v -keystore <filename>.keystore -alias <key-name> -keyalg RSA -keysize 2048 -validity 10000

We can use a different key for each published application and specify a different name to identify it. Also, Google expects a validity of at least 25 years or more. A very important thing to consider is to keep a backup of the key and store it securely, because once it is compromised it is impossible to update an already published application.

Publishing to Google Play

Publishing to Google Play is very simple and involves registering for Google Play. You just have to visit and register at https://play.google.com/. It requires $25 USD to register, is fairly straightforward, and can take a few days until you get the final access.
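As promised above, here is a minimal, hedged sketch of reading values that a preferences.xml screen stores in the default SharedPreferences; the key names used here are hypothetical and not taken from the original project.

import android.content.Context;
import android.content.SharedPreferences;
import android.preference.PreferenceManager;
import android.util.Log;

public class SettingsReader {
    // Reads values saved by the preference views described above.
    // "enable_sound" and "user_name" are illustrative keys only.
    public static void readSettings(Context context) {
        SharedPreferences prefs =
                PreferenceManager.getDefaultSharedPreferences(context);

        boolean soundEnabled = prefs.getBoolean("enable_sound", false); // CheckBoxPreference
        String userName = prefs.getString("user_name", "");             // EditTextPreference

        Log.i("SettingsReader", "sound=" + soundEnabled + ", name=" + userName);
    }
}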
Summary In this article, we learned how to install the Eclipse Juno (the IDE), the Android SDK and the testing platform. Also, we learned about the fragment and its usage, and used it to have different layouts for landscape mode for our application DistanceConverter. We also learned about handling different screen types and persisting state during screen mode changes. Resources for Article: Further resources on this subject: Installing Alfresco Software Development Kit (SDK) [Article] JBoss AS plug-in and the Eclipse Web Tools Platform [Article] Creating a pop-up menu [Article]

Creating a sample application (Simple)

Packt
04 Sep 2013
8 min read
(For more resources related to this topic, see here.) How to do it... To create an application, include the JavaScript and CSS files in your page. Perform the following steps: Create an HTML document, index.html, under your project directory. Please note that this directory should be placed in the web root of your web server. Create the directories styles and scripts under your project directory. Copy the CSS file kendo.mobile.all.min.css, from <downloaded directory>/styles to the styles directory created in step 2. Then add a reference to the CSS file in the head section of the document. Download the jQuery library from jQuery.com. Place this file in the scripts directory and add a reference to this file in the document before closing the body tag. You can also specify the CDN location of the file in the document. Copy the JavaScript file kendo.mobile.min.js, from the <downloaded directory>/js tag to the scripts directory created in step 2. Then add a reference to this JavaScript file in the document (after jQuery). Add the text "Hello Kendo!!" in the body tag of the index.html file as follows: <!DOCTYPE HTML><html><head><title>My first Kendo Mobile Application</title><link rel="stylesheet"type="text/css"href="styles/kendo.mobile.all.min.css"></head><body>Hello Kendo!!<script type="text/javascript"src = "scripts/jquery.min.js"></script><script type="text/javascript"src = "scripts/kendo.mobile.min.js"></script></body></html> The preceding code snippet is a simple HTML page with references to Kendo Mobile CSS and JavaScript files. These files are minified and contain all the features, themes, and widgets. In production, you would like to include only those that are required. The downloaded ZIP file includes CSS and JavaScript files for specific features. However, in development you can use the minified files that contain all features. Another thing to note is that apart from the reference to the script kendo.mobile.min.js, the page also includes a reference to jQuery. It is the only external dependency for Kendo UI. When you view this page on a mobile device, you will see the text Hello Kendo!! shown. This page does not include any of the widgets that come as a part of the library. Now let's build on top of our Hello World application and add some visual elements; that is, UI widgets to the page. This can be done with the following steps: Add a layout first. A mobile application generally has a header, a footer, and multiple views. It is also observed that while navigating through different views in an application, the header and footer remain constant. The framework allows you to define a global layout that may contain a header and a footer for all the views in the application. Also, the framework allows you to define multiple views that can share the same layout. The following is the same page that now includes a header and footer defined in the layout: <body><div data-role="layout" data-id="defaultLayout"> <header data-role="header"> <div data-role="navbar"> My first application </div> </header> <footer data-role="footer"> <div data-role="tabstrip"> <a data-icon="about">About</a> <a data-icon="settings">Settings</a> </div> </footer> </div></body> The body contains a few div tags with data attributes. Let's look into one of these tags in detail. <div data-role="layout" data-id="defaultLayout"> Here, the div tag contains two data attributes, role and id. The role data attribute is used to initialize and configure a widget. 
The data-role attribute has a value, layout, identifying the target element as a layout widget. All the widgets are expected to have a role data attribute that helps in marking the target element for a specific purpose. It instructs the library which widget needs to be added to the page. The id data attribute is used to identify the widget (the layout widget) in the page. A page may define several layout widgets and each one of these must be identified by a unique ID. Here, the data-id attribute has defaultLayout as its value. Now there can be many views referring to this layout by its id. Similarly, there are other elements in the page with the data-role attribute, defining them as one of widgets in the page. Let's take a look at the header and footer widgets defined inside the layout. <header data-role="header">... </header><footer data-role="footer">...</footer> The header and footer tags have the role data attribute set to header and footer respectively. This aligns them to the top and bottom of the page, giving the rest of the available space for different views to render. Also, note that there is a navbar widget in the header and a tabstrip widget defined in the footer. As mentioned earlier, the framework comes with several widgets that can help you build the application rapidly. Now add views to the page. The index.html page now has a layout defined and when you run the page in the browser, you will see an error message in the console which says: Uncaught Error: Your kendo mobile application element does not contain any direct child elements with data-role="view" attribute set. Make sure that you instantiate the mobile application using the correct container. Views represent the actual content that has to be displayed between the header and the footer that we defined while creating a layout. A layout cannot exist without a view and hence you see that error message in the console. To fix this error, you need to define a view for your mobile application. Add the following to your index.html page: <div data-role="view" data-layout="defaultLayout"> Hello Kendo!!</div> As mentioned earlier, every widget needs to have a role data attribute to identify itself as a particular widget in the page. Here, the target element is defined as a view widget and tied to the layout by defining the data-layout attribute. The data-layout attribute has a value defaultLayout that is the same as the value for the data-id attribute of the layout that we defined earlier. This attaches the view to the layout and you will not see the error message anymore. Similarly, you can have multiple Views defined in the page that can make use of the same layout. Now, there's only one pending task for the application to start working: initializing the application. A Kendo Mobile application can be initialized using the Application object. To do that, add the following code to the page: <script> var app = new kendo.mobile.Application();</script> Include the previous script block right after references to jQuery and Kendo Mobile and before closing the body tag. This single line of JavaScript code will initialize your Kendo Mobile application and all the widgets with the data-role attribute. The Application object is used for many other purposes . How it works... When you run the index.html page in a browser, you will see a navbar and a tabstrip in the header and footer of the page. Also, the message Hello Kendo!! being shown in the body of the page. 
The following screenshot shows how it will look when you view the page on an iPhone: If you have noticed, this looks like a native iOS application. The framework has the capability to render the application so that it looks like a native application on the device. When you view the same page on an Android device, it will look like a native Android application, as shown in the following screenshot: The framework identifies the platform on which the mobile application is being run and then provides a native look and feel to the application. There are ways in which you can customize this behavior.

Summary

Creating a sample application (Simple) got us started with the Kendo UI Mobile framework and showed us how to build a sample application using it. We also saw some of the Mobile UI widgets, such as layouts, views, navbar, and tabstrip, in brief.

Resources for Article: Further resources on this subject: Working with remote data [Article], The Decider: External APIs [Article], Constructing and Evaluating Your Design Solution [Article]

Android Native Application API

Packt
13 May 2013
21 min read
(For more resources related to this topic, see here.) Based on the features provided by the functions defined in these header files, the APIs can be grouped as follows: Activity lifecycle management: native_activity.h looper.h Windows management: rect.h window.h native_window.h native_window_jni.h Input (including key and motion events) and sensor events: input.h keycodes.h sensor.h Assets, configuration, and storage management: configuration.h asset_manager.h asset_manager_jni.h storage_manager.h obb.h In addition, Android NDK also provides a static library named native app glue to help create and manage native activities. The source code of this library can be found under the sources/android/native_app_glue/ directory. In this article, we will first introduce the creation of a native activity with the simple callback model provided by native_acitivity.h, and the more complicated but flexible two-threaded model enabled by the native app glue library. We will then discuss window management at Android NDK, where we will draw something on the screen from the native code. Input events handling and sensor accessing are introduced next. Lastly, we will introduce asset management, which manages the files under the assets folder of our project. Note that the APIs covered in this article can be used to get rid of the Java code completely, but we don't have to do so. The Managing assets at Android NDK recipe provides an example of using the asset management API in a mixed-code Android project. Before we start, it is important to keep in mind that although no Java code is needed in a native activity, the Android application still runs on Dalvik VM, and a lot of Android platform features are accessed through JNI. The Android native application API just hides the Java world for us. Creating a native activity with the native_activity.h interface The Android native application API allows us to create a native activity, which makes writing Android apps in pure native code possible. This recipe introduces how to write a simple Android application with pure C/C++ code. Getting ready Readers are expected to have basic understanding of how to invoke JNI functions. How to do it… The following steps to create a simple Android NDK application without a single line of Java code: Create an Android application named NativeActivityOne. Set the package name as cookbook.chapter5.nativeactivityone. Right-click on the NativeActivityOne project, select Android Tools | Add Native Support. Change the AndroidManifest.xml file as follows: <manifest package="cookbook.chapter5.nativeactivityone"android:versionCode="1"android:versionName="1.0"><uses-sdk android_minSdkVersion="9"/><application android_label="@string/app_name"android:icon="@drawable/ic_launcher"android:hasCode="true"><activity android_name="android.app.NativeActivity"android:label="@string/app_name"android:configChanges="orientation|keyboardHidden"><meta-data android_name="android.app.lib_name"android:value="NativeActivityOne" /><intent-filter><action android_name="android.intent.action.MAIN" /><category android_name="android.intent.category.LAUNCHER" /></intent-filter></activity></application></manifest> We should ensure that the following are set correctly in the preceding file: The activity name must be set to android.app.NativeActivity. The value of the android.app.lib_name metadata must be set to the native module name without the lib prefix and .so suffix. android:hasCode needs to be set to true, which indicates that the application contains code. 
Note that the documentation in <NDK root>/docs/NATIVE-ACTIVITY.HTML gives an example of the AndroidManifest.xml file with android:hasCode set to false, which will not allow the application to start. Add two files named NativeActivityOne.cpp and mylog.h under the jni folder. The ANativeActivity_onCreate method should be implemented in NativeActivityOne.cpp. The following is an example of the implementation: void ANativeActivity_onCreate(ANativeActivity* activity,void* savedState, size_t savedStateSize) {printInfo(activity);activity->callbacks->onStart = onStart;activity->callbacks->onResume = onResume;activity->callbacks->onSaveInstanceState = onSaveInstanceState;activity->callbacks->onPause = onPause;activity->callbacks->onStop = onStop;activity->callbacks->onDestroy = onDestroy;activity->callbacks->onWindowFocusChanged =onWindowFocusChanged;activity->callbacks->onNativeWindowCreated =onNativeWindowCreated;activity->callbacks->onNativeWindowResized =onNativeWindowResized;activity->callbacks->onNativeWindowRedrawNeeded =onNativeWindowRedrawNeeded;activity->callbacks->onNativeWindowDestroyed =onNativeWindowDestroyed;activity->callbacks->onInputQueueCreated = onInputQueueCreated;activity->callbacks->onInputQueueDestroyed =onInputQueueDestroyed;activity->callbacks->onContentRectChanged =onContentRectChanged;activity->callbacks->onConfigurationChanged =onConfigurationChanged;activity->callbacks->onLowMemory = onLowMemory;activity->instance = NULL;} Add the Android.mk file under the jni folder: LOCAL_PATH := $(call my-dir)include $(CLEAR_VARS)LOCAL_MODULE := NativeActivityOneLOCAL_SRC_FILES := NativeActivityOne.cppLOCAL_LDLIBS := -landroid -lloginclude $(BUILD_SHARED_LIBRARY) Build the Android application and run it on an emulator or a device. Start a terminal and display the logcat output using the following: $ adb logcat -v time NativeActivityOne:I *:S Alternatively, you can use the logcat view at Eclipse to see the logcat output. When the application starts, you should be able to see the following logcat output: As shown in the screenshot, a few Android activity lifecycle callback functions are executed. We can manipulate the phone to cause other callbacks being executed. For example, long pressing the home button and then pressing the back button will cause the onWindowFocusChanged callback to be executed. How it works… In our example, we created a simple, "pure" native application to output logs when the Android framework calls into the callback functions defined by us. The "pure" native application is not really pure native. Although we did not write a single line of Java code, the Android framework still runs some Java code on Dalvik VM. Android framework provides an android.app.NativeActivity.java class to help us create a "native" activity. In a typical Java activity, we extend android.app.Activity and overwrite the activity lifecycle methods. NativeActivity is also a subclass of android. app.Activity and does similar things. At the start of a native activity, NativeActivity. java will call ANativeActivity_onCreate, which is declared in native_activity.h and implemented by us. In the ANativeActivity_onCreate method, we can register our callback methods to handle activity lifecycle events and user inputs. At runtime, NativeActivity will invoke these native callback methods when the corresponding events occurred. In a word, NativeActivity is a wrapper that hides the managed Android Java world for our native code, and exposes the native interfaces defined in native_activity.h. 
The ANativeActivity data structure: Every callback method in the native code accepts an instance of the ANativeActivity structure. Android NDK defines the ANativeActivity data structure in native_acitivity.h as follows: typedef struct ANativeActivity {struct ANativeActivityCallbacks* callbacks;JavaVM* vm;JNIEnv* env;jobject clazz;const char* internalDataPath;const char* externalDataPath;int32_t sdkVersion;void* instance;AAssetManager* assetManager;} ANativeActivity; The various attributes of the preceding code are explained as follows: callbacks: It is a data structure that defines all the callbacks that the Android framework will invoke with the main UI thread. vm: It is the application process' global Java VM handle. It is used in some JNI functions. env: It is a JNIEnv interface pointer. JNIEnv is used through local storage data , so this field is only accessible through the main UI thread. clazz: It is a reference to the android.app.NativeActivity object created by the Android framework. It can be used to access fields and methods in the android. app.NativeActivity Java class. In our code, we accessed the toString method of android.app.NativeActivity. internalDataPath: It is the internal data directory path for the application. externalDataPath: It is the external data directory path for the application. internalDataPath and externalDataPath are NULL at Android 2.3.x. This is a known bug and has been fixed since Android 3.0. If we are targeting devices lower than Android 3.0, then we need to find other ways to get the internal and external data directories. sdkVersion: It is the Android platform's SDK version code. Note that this refers to the version of the device/emulator that runs the app, not the SDK version used in our development. instance: It is not used by the framework. We can use it to store user-defined data and pass it around. assetManager: It is the a pointer to the app's instance of the asset manager. We will need it to access assets data. We will discuss it in more detail in the Managing assets at Android NDK recipe of this article There's more… The native_activity.h interface provides a simple single thread callback mechanism, which allows us to write an activity without Java code. However, this single thread approach infers that we must quickly return from our native callback methods. Otherwise, the application will become unresponsive to user actions (for example, when we touch the screen or press the Menu button, the app does not respond because the GUI thread is busy executing the callback function). A way to solve this issue is to use multiple threads. For example, many games take a few seconds to load. We will need to offload the loading to a background thread, so that the UI can display the loading progress and be responsive to user inputs. Android NDK comes with a static library named android_native_app_glue to help us in handling such cases. The details of this library are covered in the Creating a native activity with the Android native app glue recipe. A similar problem exists at Java activity. For example, if we write a Java activity that searches the entire device for pictures at onCreate, the application will become unresponsive. We can use AsyncTask to search and load pictures in the background, and let the main UI thread display a progress bar and respond to user inputs. Creating a native activity with the Android native app glue The previous recipe described how the interface defined in native_activity.h allows us to create native activity. 
However, all the callbacks defined are invoked with the main UI thread, which means we cannot do heavy processing in the callbacks. Android SDK provides AsyncTask, Handler, Runnable, Thread, and so on, to help us handle things in the background and communicate with the main UI thread. Android NDK provides a static library named android_native_app_glue to help us execute callback functions and handle user inputs in a separate thread. This recipe will discuss the android_native_app_glue library in detail. Getting ready The android_native_app_glue library is built on top of the native_activity.h interface. Therefore, readers are recommended to read the Creating a native activity with the native_activity.h interface recipe before going through this one. How to do it… The following steps create a simple Android NDK application based on the android_native_app_glue library: Create an Android application named NativeActivityTwo. Set the package name as cookbook.chapter5.nativeactivitytwo. Right-click on the NativeActivityTwo project, select Android Tools | Add Native Support. Change the AndroidManifest.xml file as follows: <manifest package="cookbook.chapter5.nativeactivitytwo"android:versionCode="1"android:versionName="1.0"><uses-sdk android_minSdkVersion="9"/><application android_label="@string/app_name"android:icon="@drawable/ic_launcher"android:hasCode="true"><activity android_name="android.app.NativeActivity"android:label="@string/app_name"android:configChanges="orientation|keyboardHidden"><meta-data android_name="android.app.lib_name"android:value="NativeActivityTwo" /><intent-filter><action android_name="android.intent.action.MAIN" /><category android_name="android.intent.category.LAUNCHER" /></intent-filter></activity></application></manifest> Add two files named NativeActivityTwo.cpp and mylog.h under the jni folder. NativeActivityTwo.cpp is shown as follows: #include <jni.h>#include <android_native_app_glue.h>#include "mylog.h"void handle_activity_lifecycle_events(struct android_app* app,int32_t cmd) {LOGI(2, "%d: dummy data %d", cmd, *((int*)(app->userData)));}void android_main(struct android_app* app) {app_dummy(); // Make sure glue isn't stripped.int dummyData = 111;app->userData = &dummyData;app->onAppCmd = handle_activity_lifecycle_events;while (1) {int ident, events;struct android_poll_source* source;if ((ident=ALooper_pollAll(-1, NULL, &events, (void**)&source)) >=0) {source->process(app, source);}}} Add the Android.mk file under the jni folder: LOCAL_PATH := $(call my-dir)include $(CLEAR_VARS)LOCAL_MODULE := NativeActivityTwoLOCAL_SRC_FILES := NativeActivityTwo.cppLOCAL_LDLIBS := -llog -landroidLOCAL_STATIC_LIBRARIES := android_native_app_glueinclude $(BUILD_SHARED_LIBRARY)$(call import-module,android/native_app_glue) Build the Android application and run it on an emulator or device. Start a terminal and display the logcat output by using the following command: adb logcat -v time NativeActivityTwo:I *:S When the application starts, you should be able to see the following logcat output and the device screen will shows a black screen: On pressing the back button, the following output will be shown: How it works… This recipe demonstrates how the android_native_app_glue library is used to create a native activity. The following steps should be followed to use the android_native_app_glue library: Implement a function named android_main. This function should implement an event loop, which will poll for events continuously. 
This method will run in the background thread created by the library. Two event queues are attached to the background thread by default, including the activity lifecycle event queue and the input event queue. When polling events using the looper created by the library, you can identify where the event is coming from, by checking the returned identifier (either LOOPER_ID_MAIN or LOOPER_ID_INPUT). It is also possible to attach additional event queues to the background thread. When an event is returned, the data pointer will point to an android_poll_source data structure. We can call the process function of this structure. The process is a function pointer, which points to android_app->onAppCmd for activity lifecycle events, and android_app->onInputEvent for input events. We can provide our own processing functions and direct the corresponding function pointers to these functions. In our example, we implement a simple function named handle_activity_lifecycle_ events and point the android_app->onAppCmd function pointer to it. This function simply prints the cmd value and the user data passed along with the android_app data structure. cmd is defined in android_native_app_glue.h as an enum. For example, when the app starts, the cmd values are 10, 11, 0, 1, and 6, which correspond to APP_CMD_START, APP_CMD_RESUME, APP_CMD_INPUT_CHANGED, APP_CMD_INIT_WINDOW, and APP_CMD_ GAINED_FOCUS respectively. android_native_app_glue Library Internals: The source code of the android_native_ app_glue library can be found under the sources/android/native_app_glue folder of Android NDK. It only consists of two files, namely android_native_app_glue.c and android_native_app_glue.h. Let's first describe the flow of the code and then discuss some important aspects in detail. Since the source code for native_app_glue is provided, we can modify it if necessary, although in most cases it won't be necessary. android_native_app_glue is built on top of the native_activity.h interface. As shown in the following code (extracted from sources/android/native_app_glue/ android_native_app_glue.c). It implements the ANativeActivity_onCreate function, where it registers the callback functions and calls the android_app_create function. Note that the returned android_app instance is pointed by the instance field of the native activity, which can be passed to various callback functions: void ANativeActivity_onCreate(ANativeActivity* activity,void* savedState, size_t savedStateSize) {LOGV("Creating: %pn", activity);activity->callbacks->onDestroy = onDestroy;activity->callbacks->onStart = onStart;activity->callbacks->onResume = onResume;… …activity->callbacks->onNativeWindowCreated =onNativeWindowCreated;activity->callbacks->onNativeWindowDestroyed =onNativeWindowDestroyed;activity->callbacks->onInputQueueCreated = onInputQueueCreated;activity->callbacks->onInputQueueDestroyed =onInputQueueDestroyed;activity->instance = android_app_create(activity, savedState,savedStateSize);} The android_app_create function (shown in the following code snippet) initializes an instance of the android_app data structure, which is defined in android_native_app_ glue.h. This function creates a unidirectional pipe for inter-thread communication. After that, it spawns a new thread (let's call it background thread thereafter) to run the android_ app_entry function with the initialized android_app data as the input argument. 
The main thread will wait for the background thread to start and then return: static struct android_app* android_app_create(ANativeActivity*activity, void* savedState, size_t savedStateSize) {struct android_app* android_app = (struct android_app*)malloc(sizeof(struct android_app));memset(android_app, 0, sizeof(struct android_app));android_app->activity = activity;pthread_mutex_init(&android_app->mutex, NULL);pthread_cond_init(&android_app->cond, NULL);……int msgpipe[2];if (pipe(msgpipe)) {LOGE("could not create pipe: %s", strerror(errno));return NULL;}android_app->msgread = msgpipe[0];android_app->msgwrite = msgpipe[1];pthread_attr_t attr;pthread_attr_init(&attr);pthread_attr_setdetachstate(&attr, PTHREAD_CREATE_DETACHED);pthread_create(&android_app->thread, &attr, android_app_entry,android_app);// Wait for thread to start.pthread_mutex_lock(&android_app->mutex);while (!android_app->running) {pthread_cond_wait(&android_app->cond, &android_app->mutex);}pthread_mutex_unlock(&android_app->mutex);return android_app;} The background thread starts with the android_app_entry function (as shown in the following code snippet), where a looper is created. Two event queues will be attached to the looper. The activity lifecycle events queue is attached to the android_app_entry function. When the activity's input queue is created, the input queue is attached (to the android_ app_pre_exec_cmd function of android_native_app_glue.c). After attaching the activity lifecycle event queue, the background thread signals the main thread it is already running. It then calls a function named android_main with the android_app data. android_main is the function we need to implement, as shown in our sample code. It must run in a loop until the activity exits: static void* android_app_entry(void* param) {struct android_app* android_app = (struct android_app*)param;… …//Attach life cycle event queue with identifier LOOPER_ID_MAINandroid_app->cmdPollSource.id = LOOPER_ID_MAIN;android_app->cmdPollSource.app = android_app;android_app->cmdPollSource.process = process_cmd;android_app->inputPollSource.id = LOOPER_ID_INPUT;android_app->inputPollSource.app = android_app;android_app->inputPollSource.process = process_input;ALooper* looper = ALooper_prepare(ALOOPER_PREPARE_ALLOW_NON_CALLBACKS);ALooper_addFd(looper, android_app->msgread, LOOPER_ID_MAIN,ALOOPER_EVENT_INPUT, NULL, &android_app->cmdPollSource);android_app->looper = looper;pthread_mutex_lock(&android_app->mutex);android_app->running = 1;pthread_cond_broadcast(&android_app->cond);pthread_mutex_unlock(&android_app->mutex);android_main(android_app);android_app_destroy(android_app);return NULL;} The following diagram indicates how the main and background thread work together to create the multi-threaded native activity: We use the activity lifecycle event queue as an example. The main thread invokes the callback functions, which simply writes to the write end of the pipe, while true loop implemented in the android_main function will poll for events. Once an event is detected, the function calls the event handler, which reads the exact command from the read end of the pipe and handles it. The android_native_app_glue library implements all the main thread stuff and part of the background thread stuff for us. We only need to supply the polling loop and the event handler as illustrated in our sample code. Pipe: The main thread creates a unidirectional pipe in the android_app_create function by calling the pipe method. This method accepts an array of two integers. 
After the function is returned, the first integer will be set as the file descriptor referring to the read end of the pipe, while the second integer will be set as the file descriptor referring to the write end of the pipe. A pipe is usually used for Inter-process Communication (IPC), but here it is used for communication between the main UI thread and the background thread created at android_ app_entry. When an activity lifecycle event occurs, the main thread will execute the corresponding callback function registered at ANativeActivity_onCreate. The callback function simply writes a command to the write end of the pipe and then waits for a signal from the background thread. The background thread is supposed to poll for events continuously and once it detects a lifecycle event, it will read the exact event from the read end of the pipe, signal the main thread to unblock and handle the events. Because the signal is sent right after receiving the command and before actual processing of the events, the main thread can return from the callback function quickly without worrying about the possible long processing of the events. Different operating systems have different implementations for the pipe. The pipe implemented by Android system is "half-duplex", where communication is unidirectional. That is, one file descriptor can only write, and the other file descriptor can only read. Pipes in some operating system is "full-duplex", where the two file descriptors can both read and write. Looper is an event tracking facility, which allows us to attach one or more event queues for an event loop of a thread. Each event queue has an associated file descriptor. An event is data available on a file descriptor. In order to use a looper, we need to include the android/ looper.h header file. The library attaches two event queues for the event loop to be created by us in the background thread, including the activity lifecycle event queue and the input event queue. The following steps should be performed in order to use a looper: Create or obtain a looper associated with the current thread: This is done by the ALooper_prepare function: ALooper* ALooper_prepare(int opts); This function prepares a looper associated with the calling thread and returns it. If the looper doesn't exist, it creates one, associates it with the thread, and returns it Attach an event queue: This is done by ALooper_addFd. The function has the following prototype: int ALooper_addFd(ALooper* looper, int fd, int ident, int events,ALooper_callbackFunc callback, void* data); The function can be used in two ways. Firstly, if callback is set to NULL, the ident set will be returned by ALooper_pollOnce and ALooper_pollAll. Secondly, if callback is non-NULL, then the callback function will be executed and ident is ignored. The android_native_app_glue library uses the first approach to attach a new event queue to the looper. The input argument fd indicates the file descriptor associated with the event queue. ident is the identifier for the events from the event queue, which can be used to classify the event. The identifier must be bigger than zero when callback is set to NULL. callback is set to NULL in the library source code, and data points to the private data that will be returned along with the identifier at polling. In the library, this function is called to attach the activity lifecycle event queue to the background thread. 
The input event queue is attached using the input queue specific function AInputQueue_attachLooper, which we will discuss in the Detecting and handling input events at NDK recipe. Poll for events: This can be done by either one of the following two functions: int ALooper_pollOnce(int timeoutMillis, int* outFd, int*outEvents, void** outData);int ALooper_pollAll(int timeoutMillis, int* outFd, int* outEvents,void** outData); These two methods are equivalent when callback is set to NULL in ALooper_addFd. They have the same input arguments. timeoutMillis specifies the timeout for polling. If it is set to zero, then the functions return immediately; if it is set to negative, they will wait indefinitely until an event occurs. The functions return the identifier (greater than zero) when an event occurs from any input queues attached to the looper. In this case, outFd, outEvents, and outData will be set to the file descriptor, poll events, and data associated with the event. Otherwise, they will be set to NULL. Detach event queues: This is done by the following function: int ALooper_removeFd(ALooper* looper, int fd); It accepts the looper and file descriptor associated with the event queue, and detaches the queue from the looper.
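Putting the pieces together, the following minimal C sketch shows the same pipe-plus-looper pattern that android_native_app_glue uses internally: one thread writes one-byte commands into a pipe, and the thread that owns the looper polls for them with an identifier. The identifier value, the function names, and the single-byte command protocol are our own simplifications for illustration, not the library's actual code.

#include <stdint.h>
#include <unistd.h>
#include <android/looper.h>

#define MY_LOOPER_ID 1  /* must be greater than zero when no callback is supplied */

static int msgpipe[2];  /* msgpipe[0] = read end, msgpipe[1] = write end */

/* Runs on the thread that owns the looper (the "background" thread). */
static void attach_pipe_to_looper(void) {
    pipe(msgpipe);
    ALooper* looper = ALooper_prepare(ALOOPER_PREPARE_ALLOW_NON_CALLBACKS);
    /* callback is NULL, so ALooper_pollAll will return MY_LOOPER_ID when data arrives */
    ALooper_addFd(looper, msgpipe[0], MY_LOOPER_ID, ALOOPER_EVENT_INPUT, NULL, NULL);
}

/* Called from another thread (for example, the main UI thread) to post a command. */
static void post_command(int8_t cmd) {
    write(msgpipe[1], &cmd, sizeof(cmd));
}

/* Event loop of the thread that owns the looper. */
static void event_loop(void) {
    while (1) {
        int events;
        void* data;
        /* a timeout of -1 blocks until an event occurs on any attached queue */
        int ident = ALooper_pollAll(-1, NULL, &events, &data);
        if (ident == MY_LOOPER_ID) {
            int8_t cmd;
            read(msgpipe[0], &cmd, sizeof(cmd));
            /* handle_command(cmd);  -- application-specific processing goes here */
        }
    }
}

Because the writer only pushes a tiny command byte and returns immediately, the posting thread never blocks on the actual work, which is exactly why the glue library's callbacks can return quickly to the framework.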
article-image-so-what-spring-android
Packt
20 Feb 2013
3 min read
Save for later

So, what is Spring for Android?

Packt
20 Feb 2013
3 min read
(For more resources related to this topic, see here.) RestTemplate The RestTemplate module is a port of the Java-based REST client RestTemplate, which initially appeared in 2009 in Spring for MVC. Like the other Spring template counterparts (JdbcTemplate, JmsTemplate, and so on), its aim is to bring to Java developers (and thus Android developers) a high-level abstraction of lower-level Java API; in this case, it eases the development of HTTP clients. In its Android version, RestTemplate relies on the core Java HTTP facilities (HttpURLConnection) or the Apache HTTP Client. According to the Android device version you use to run your app, RestTemplate for Android can pick the most appropriate one for you. This is according to Android developers' recommendations. See http://android-developers.blogspot.ca/2011/09/androids-http-clients.html. This blog post explains why in certain cases Apache HTTP Client is preferred over HttpURLConnection. RestTemplate for Android also supports gzip compression and different message converters to convert your Java objects from and to JSON, XML, and so on. Auth/Spring Social The goal of the Spring Android Auth module is to let an Android app gain authorization to a web service provider using OAuth (Version 1 or 2). OAuth is probably the most popular authorization protocol (and it is worth mentioning that, it is an open standard) and is currently used by Facebook, Twitter, Google apps (and many others) to let third-party applications access users account. Spring for Android Auth module is based on several Spring libraries because it needs to securely (with cryptography) persist (via JDBC) a token obtained via HTTP; here is a list of the needed libraries for OAuth: Spring Security Crypto: To encrypt the token Spring Android OAuth: This extends Spring Security Crypto adding a dedicated encryptor for Android, and SQLite based persistence provider Spring Android Rest Template: To interact with the HTTP services Spring Social Core: The OAuth workflow abstraction While performing the OAuth workflow, we will also need the browser to take the user to the service provider authentication page, for example, the following is the Twitter OAuth authentication dialog: What Spring for Android is not SpringSource (the company behind Spring for Android) is very famous among Java developers. Their most popular product is the Spring Framework for Java which includes a dependency injection framework (also called an inversion of control framework). Spring for Android does not bring inversion of control to the Android platform. In its very first release (1.0.0.M1), Spring for Android brought a common logging facade for Android; the authors removed it in the next version. Summary In this article, we have learned that Spring for Android helps in easy development of Android applications. We learned the details about the important modules present in it and its functions. We also learnt about dependency injection framework in short and that Spring for Android does not bring inversion of control to the Android platform. Resources for Article : Further resources on this subject: Top 5 Must-have Android Applications [Article] Creating, Compiling, and Deploying Native Projects from the Android NDK [Article] Manifest Assurance: Security and Android Permissions for Flash [Article]

article-image-applications-physics
Packt
18 Feb 2013
16 min read
Save for later

Applications of Physics

Packt
18 Feb 2013
16 min read
(For more resources related to this topic, see here.) Introduction to the Box2D physics extension Physics-based games are one of the most popular types of games available for mobile devices. AndEngine allows the creation of physics-based games with the Box2D extension. With this extension, we can construct any type of physically realistic 2D environment from small, simple simulations to complex games. In this recipe, we will create an activity that demonstrates a simple setup for utilizing the Box2D physics engine extension. Furthermore, we will use this activity for the remaining recipes in this article. Getting ready... First, create a new activity class named PhysicsApplication that extends BaseGameActivity and implements IAccelerationListener and IOnSceneTouchListener. How to do it... Follow these steps to build our PhysicsApplication activity class: Create the following variables in the class: public static int cameraWidth = 800; public static int cameraHeight = 480; public Scene mScene; public FixedStepPhysicsWorld mPhysicsWorld; public Body groundWallBody; public Body roofWallBody; public Body leftWallBody; public Body rightWallBody; We need to set up the foundation of our activity. To start doing so, place these four, common overridden methods in the class to set up the engine, resources, and the main scene: @Override public Engine onCreateEngine(final EngineOptions pEngineOptions) { return new FixedStepEngine(pEngineOptions, 60); } @Override public EngineOptions onCreateEngineOptions() { EngineOptions engineOptions = new EngineOptions(true, ScreenOrientation.LANDSCAPE_SENSOR, new FillResolutionPolicy(), new Camera(0,0, cameraWidth, cameraHeight)); engineOptions.getRenderOptions().setDithering(true); engineOptions.getRenderOptions(). getConfigChooserOptions() .setRequestedMultiSampling(true); engineOptions.setWakeLockOptions( WakeLockOptions.SCREEN_ON); return engineOptions; } @Override public void onCreateResources(OnCreateResourcesCallback pOnCreateResourcesCallback) { pOnCreateResourcesCallback. 
onCreateResourcesFinished(); } @Override public void onCreateScene(OnCreateSceneCallback pOnCreateSceneCallback) { mScene = new Scene(); mScene.setBackground(new Background(0.9f,0.9f,0.9f)); pOnCreateSceneCallback.onCreateSceneFinished(mScene); } Continue setting up the activity by adding the following overridden method, which will be used to populate our scene: @Override public void onPopulateScene(Scene pScene, OnPopulateSceneCallback pOnPopulateSceneCallback) { } Next, we will fill the previous method with the following code to create our PhysicsWorld object and Scene object: mPhysicsWorld = new FixedStepPhysicsWorld(60, new Vector2(0f,-SensorManager.GRAVITY_EARTH*2), false, 8, 3); mScene.registerUpdateHandler(mPhysicsWorld); final FixtureDef WALL_FIXTURE_DEF = PhysicsFactory.createFixtureDef(0, 0.1f, 0.5f); final Rectangle ground = new Rectangle(cameraWidth / 2f, 6f, cameraWidth - 4f, 8f, this.getVertexBufferObjectManager()); final Rectangle roof = new Rectangle(cameraWidth / 2f, cameraHeight – 6f, cameraWidth - 4f, 8f, this.getVertexBufferObjectManager()); final Rectangle left = new Rectangle(6f, cameraHeight / 2f, 8f, cameraHeight - 4f, this.getVertexBufferObjectManager()); final Rectangle right = new Rectangle(cameraWidth - 6f, cameraHeight / 2f, 8f, cameraHeight - 4f, this.getVertexBufferObjectManager()); ground.setColor(0f, 0f, 0f); roof.setColor(0f, 0f, 0f); left.setColor(0f, 0f, 0f); right.setColor(0f, 0f, 0f); groundWallBody = PhysicsFactory.createBoxBody( this.mPhysicsWorld, ground, BodyType.StaticBody, WALL_FIXTURE_DEF); roofWallBody = PhysicsFactory.createBoxBody( this.mPhysicsWorld, roof, BodyType.StaticBody, WALL_FIXTURE_DEF); leftWallBody = PhysicsFactory.createBoxBody( this.mPhysicsWorld, left, BodyType.StaticBody, WALL_FIXTURE_DEF); rightWallBody = PhysicsFactory.createBoxBody( this.mPhysicsWorld, right, BodyType.StaticBody, WALL_FIXTURE_DEF); this.mScene.attachChild(ground); this.mScene.attachChild(roof); this.mScene.attachChild(left); this.mScene.attachChild(right); // Further recipes in this chapter will require us to place code here. mScene.setOnSceneTouchListener(this); pOnPopulateSceneCallback.onPopulateSceneFinished(); The following overridden activities handle the scene touch events, the accelerometer input, and the two engine life cycle events—onResumeGame and onPauseGame. Place them at the end of the class to finish this recipe: @Override public boolean onSceneTouchEvent(Scene pScene, TouchEvent pSceneTouchEvent) { // Further recipes in this chapter will require us to place code here. return true; } @Override public void onAccelerationAccuracyChanged( AccelerationData pAccelerationData) {} @Override public void onAccelerationChanged( AccelerationData pAccelerationData) { final Vector2 gravity = Vector2Pool.obtain( pAccelerationData.getX(), pAccelerationData.getY()); this.mPhysicsWorld.setGravity(gravity); Vector2Pool.recycle(gravity); } @Override public void onResumeGame() { super.onResumeGame(); this.enableAccelerationSensor(this); } @Override public void onPauseGame() { super.onPauseGame(); this.disableAccelerationSensor(); } How it works... The first thing that we do is define a camera width and height. Then, we define a Scene object and a FixedStepPhysicsWorld object in which the physics simulations will take place. The last set of variables defines what will act as the borders for our physics-based scenes. In the second step, we override the onCreateEngine() method to return a FixedStepEngine object that will process 60 updates per second. 
The reason that we do this, while also using a FixedStepPhysicsWorld object, is to create a simulation that will be consistent across all devices, regardless of how efficiently a device can process the physics simulation. We then create the EngineOptions object with standard preferences, create the onCreateResources() method with only a simple callback, and set the main scene with a light-gray background. In the onPopulateScene() method, we create our FixedStepPhysicsWorld object that has double the gravity of the Earth, passed as an (x,y) coordinate Vector2 object, and will update 60 times per second. The gravity can be set to other values to make our simulations more realistic or 0 to create a zero gravity simulation. A gravity setting of 0 is useful for space simulations or for games that use a top-down camera view instead of a profile. The false Boolean parameter sets the AllowSleep property of the PhysicsWorld object, which tells PhysicsWorld to not let any bodies deactivate themselves after coming to a stop. The last two parameters of the FixedStepPhysicsWorld object tell the physics engine how many times to calculate velocity and position movements. Higher iterations will create simulations that are more accurate, but can cause lag or jitteriness because of the extra load on the processor. After creating the FixedStepPhysicsWorld object, we register it with the main scene as an update handler. The physics world will not run a simulation without being registered. The variable WALL_FIXTURE_DEF is a fixture definition. Fixture definitions hold the shape and material properties of entities that will be created within the physics world as fixtures. The shape of a fixture can be either circular or polygonal. The material of a fixture is defined by its density, elasticity, and friction, all of which are required when creating a fixture definition. Following the creation of the WALL_FIXTURE_DEF variable, we create four rectangles that will represent the locations of the wall bodies. A body in the Box2D physics world is made of fixtures. While only one fixture is necessary to create a body, multiple fixtures can create complex bodies with varying properties. Further along in the onPopulateScene() method, we create the box bodies that will act as our walls in the physics world. The rectangles that were previously created are passed to the bodies to define their position and shape. We then define the bodies as static, which means that they will not react to any forces in the physics simulation. Lastly, we pass the wall fixture definition to the bodies to complete their creation. After creating the bodies, we attach the rectangles to the main scene and set the scene's touch listener to our activity, which will be accessed by the onSceneTouchEvent() method. The final line of the onPopulateScene() method tells the engine that the scene is ready to be shown. The overridden onSceneTouchEvent() method will handle all touch interactions for our scene. The onAccelerationAccuracyChanged() and onAccelerationChanged() methods are inherited from the IAccelerationListener interface and allow us to change the gravity of our physics world when the device is tilted, rotated, or panned. We override onResumeGame() and onPauseGame() to keep the accelerometer from using unnecessary battery power when our game activity is not in the foreground. There's more... In the overridden onAccelerationChanged() method, we make two calls to the Vector2Pool class. 
The Vector2Pool class simply gives us a way of re-using our Vector2 objects that might otherwise require garbage collection by the system. On newer devices, the Android Garbage Collector has been streamlined to reduce noticeable hiccups, but older devices might still experience lag depending on how much memory the variables being garbage collected occupy. Visit http://www.box2d.org/manual.htmlto see the Box2D User Manual. The AndEngine Box2D extension is based on a Java port of the official Box2D C++ physics engine, so some variations in procedure exist, but the general concepts still apply. See also Understanding different body types in this article. Understanding different body types The Box2D physics world gives us the means to create different body types that allow us to control the physics simulation. We can generate dynamic bodies that react to forces and other bodies, static bodies that do not move, and kinematic bodies that move but are not affected by forces or other bodies. Choosing which type each body will be is vital to producing an accurate physics simulation. In this recipe, we will see how three bodies react to each other during collision, depending on their body types. Getting ready... Follow the recipe in the Introduction to the Box2D physics extension section given at the beginning of this article to create a new activity that will facilitate the creation of our bodies with varying body types. How to do it... Complete the following steps to see how specifying a body type for bodies affects them: First, insert the following fixture definition into the onPopulateScene() method: FixtureDef BoxBodyFixtureDef = PhysicsFactory.createFixtureDef(20f, 0f, 0.5f); Next, place the following code that creates three rectangles and their corresponding bodies after the fixture definition from the previous step: Rectangle staticRectangle = new Rectangle(cameraWidth / 2f,75f,400f,40f,this.getVertexBufferObjectManager()); staticRectangle.setColor(0.8f, 0f, 0f); mScene.attachChild(staticRectangle); PhysicsFactory.createBoxBody(mPhysicsWorld, staticRectangle, BodyType.StaticBody, BoxBodyFixtureDef); Rectangle dynamicRectangle = new Rectangle(400f, 120f, 40f, 40f, this.getVertexBufferObjectManager()); dynamicRectangle.setColor(0f, 0.8f, 0f); mScene.attachChild(dynamicRectangle); Body dynamicBody = PhysicsFactory.createBoxBody(mPhysicsWorld, dynamicRectangle, BodyType.DynamicBody, BoxBodyFixtureDef); mPhysicsWorld.registerPhysicsConnector(new PhysicsConnector( dynamicRectangle, dynamicBody); Rectangle kinematicRectangle = new Rectangle(600f, 100f, 40f, 40f, this.getVertexBufferObjectManager()); kinematicRectangle.setColor(0.8f, 0.8f, 0f); mScene.attachChild(kinematicRectangle); Body kinematicBody = PhysicsFactory.createBoxBody(mPhysicsWorld, kinematicRectangle, BodyType.KinematicBody, BoxBodyFixtureDef); mPhysicsWorld.registerPhysicsConnector(new PhysicsConnector( kinematicRectangle, kinematicBody); Lastly, add the following code after the definitions from the previous step to set the linear and angular velocities for our kinematic body: kinematicBody.setLinearVelocity(-2f, 0f); kinematicBody.setAngularVelocity((float) (-Math.PI)); How it works... In the first step, we create the BoxBodyFixtureDef fixture definition that we will use when creating our bodies in the second step. For more information on fixture definitions, see the Introduction to the Box2D physics extension recipe in this article. In step two, we first define the staticRectangle rectangle by calling the Rectangle constructor. 
We place staticRectangle at the position of cameraWidth / 2f, 75f, which is near the lower-center of the scene, and we set the rectangle to have a width of 400f and a height of 40f, which makes the rectangle into a long, flat bar. Then, we set the staticRectangle rectangle's color to be red by calling staticRectangle. setColor(0.8f, 0f, 0f). Lastly, for the staticRectangle rectangle, we attach it to the scene by calling the mScene.attachChild() method with staticRectangle as the parameter. Next, we create a body in the physics world that matches our staticRectangle. To do this, we call the PhysicsFactory.createBoxBody() method with the parameters of mPhysicsWorld, which is our physics world, staticRectangle to tell the box to be created with the same position and size as the staticRectangle rectangle, BodyType. StaticBody to define the body as static, and our BoxBodyFixtureDef fixture definition. Our next rectangle, dynamicRectangle, is created at the location of 400f and 120f, which is the middle of the scene slightly above the staticRectangle rectangle. Our dynamicRectangle rectangle's width and height are set to 40f to make it a small square. Then, we set its color to green by calling dynamicRectangle.setColor(0f, 0.8f, 0f) and attach it to our scene using mScene.attachChild(dynamicRectangle). Next, we create the dynamicBody variable using the PhysicsFactory.createBoxBody() method in the same way that we did for our staticRectangle rectangle. Notice that we set the dynamicBody variable to have BodyType of DynamicBody. This sets the body to be dynamic. Now, we register PhysicsConnector with the physics world to link dynamicRectangle and dynamicBody. A PhysicsConnecter class links an entity within our scene to a body in the physics world, representing the body's realtime position and rotation in our scene. Our last rectangle, kinematicRectangle, is created at the location of 600f and 100f, which places it on top of our staticRectangle rectangle toward the right-hand side of the scene. It is set to have a height and width of 40f, which makes it a small square like our dynamicRectangle rectangle. We then set the kinematicRectangle rectangle's color to yellow and attach it to our scene. Similar to the previous two bodies that we created, we call the PhysicsFactory.createBoxBody() method to create our kinematicBody variable. Take note that we create our kinematicBody variable with a BodyType type of KinematicBody. This sets it to be kinematic and thus moved only by the setting of its velocities. Lastly, we register a PhysicsConnector class between our kinematicRectangle rectangle and our kinematicBody body type. In the last step, we set our kinematicBody body's linear velocity by calling the setLinearVelocity() method with a vector of -2f on the x axis, which makes it move to the left. Finally, we set our kinematicBody body's angular velocity to negative pi by calling kinematicBody.setAngularVelocity((float) (-Math.PI)). For more information on setting a body's velocities, see the Using forces, velocities, and torque recipe in this article. There's more... Static bodies cannot move from applied or set forces, but can be relocated using the setTransform() method. However, we should avoid using the setTransform() method while a simulation is running, because it makes the simulation unstable and can cause some strange behaviors. 
Instead, if we want to change the position of a static body, we can do so when creating the simulation or, if we need to change the position at runtime, simply check that the new position will not cause the static body to overlap existing dynamic bodies or kinematic bodies. Kinematic bodies cannot have forces applied, but we can set their velocities via the setLinearVelocity() and setAngularVelocity() methods. See also Introduction to the Box2D physics extension in this article. Using forces, velocities, and torque in this article. Creating category-filtered bodies Depending on the type of physics simulation that we want to achieve, controlling which bodies are capable of colliding can be very beneficial. In Box2D, we can assign a category and a category filter to fixtures to control which fixtures can interact. This recipe will cover defining two category-filtered fixtures that will be applied to bodies created by touching the scene to demonstrate category-filtering. Getting ready... Create an activity by following the steps in the Introduction to the Box2D physics extension section given at the beginning of the article. This activity will facilitate the creation of the category-filtered bodies used in this section. How to do it... Follow these steps to build our category-filtering demonstration activity: Define the following class-level variables within the activity:

private int mBodyCount = 0;
public static final short CATEGORYBIT_DEFAULT = 1;
public static final short CATEGORYBIT_RED_BOX = 2;
public static final short CATEGORYBIT_GREEN_BOX = 4;
public static final short MASKBITS_RED_BOX = CATEGORYBIT_DEFAULT + CATEGORYBIT_RED_BOX;
public static final short MASKBITS_GREEN_BOX = CATEGORYBIT_DEFAULT + CATEGORYBIT_GREEN_BOX;
public static final FixtureDef RED_BOX_FIXTURE_DEF = PhysicsFactory.createFixtureDef(1, 0.5f, 0.5f, false, CATEGORYBIT_RED_BOX, MASKBITS_RED_BOX, (short) 0);
public static final FixtureDef GREEN_BOX_FIXTURE_DEF = PhysicsFactory.createFixtureDef(1, 0.5f, 0.5f, false, CATEGORYBIT_GREEN_BOX, MASKBITS_GREEN_BOX, (short) 0);

Next, create this method within the class that generates new category-filtered bodies at a given location:

private void addBody(final float pX, final float pY) {
    this.mBodyCount++;
    final Rectangle rectangle = new Rectangle(pX, pY, 50f, 50f, this.getVertexBufferObjectManager());
    rectangle.setAlpha(0.5f);
    final Body body;
    if (this.mBodyCount % 2 == 0) {
        rectangle.setColor(1f, 0f, 0f);
        body = PhysicsFactory.createBoxBody(this.mPhysicsWorld, rectangle, BodyType.DynamicBody, RED_BOX_FIXTURE_DEF);
    } else {
        rectangle.setColor(0f, 1f, 0f);
        body = PhysicsFactory.createBoxBody(this.mPhysicsWorld, rectangle, BodyType.DynamicBody, GREEN_BOX_FIXTURE_DEF);
    }
    this.mScene.attachChild(rectangle);
    this.mPhysicsWorld.registerPhysicsConnector(new PhysicsConnector(rectangle, body, true, true));
}

Lastly, fill the body of the onSceneTouchEvent() method with the following code that calls the addBody() method by passing the touched location:

if (this.mPhysicsWorld != null)
    if (pSceneTouchEvent.isActionDown())
        this.addBody(pSceneTouchEvent.getX(), pSceneTouchEvent.getY());

How it works... In the first step, we create an integer, mBodyCount, which counts how many bodies we have added to the physics world. The mBodyCount integer is used in the second step to determine which color, and thus which category, should be assigned to the new body.
We also create the CATEGORYBIT_DEFAULT, CATEGORYBIT_RED_BOX, and CATEGORYBIT_ GREEN_BOX category bits by defining them with unique power-of-two short integers and the MASKBITS_RED_BOX and MASKBITS_GREEN_BOX mask bits by adding their associated category bits together. The category bits are used to assign a category to a fixture, while the mask bits combine the different category bits to determine which categories a fixture can collide with. We then pass the category bits and mask bits to the fixture definitions to create fixtures that have category collision rules. The second step is a simple method that creates a rectangle and its corresponding body. The method takes the X and Y location parameters that we want to use to create a new body and passes them to a Rectangle object's constructor, to which we also pass a height and width of 50f and the activity's VertexBufferObjectManager. Then, we set the rectangle to be 50 percent transparent using the rectangle.setAlpha() method. After that, we define a body and modulate the mBodyCount variable by 2 to determine the color and fixture of every other created body. After determining the color and fixture, we assign them by setting the rectangle's color and creating a body by passing our mPhysicsWorld physics world, the rectangle, a dynamic body type, and the previously-determined fixture to use. Finally, we attach the rectangle to our scene and register a PhysicsConnector class to connect the rectangle to our body. The third step calls the addBody() method from step two only if the physics world has been created and only if the scene's TouchEvent is ActionDown. The parameters that are passed, pSceneTouchEvent.getX() and pSceneTouchEvent.getY(), represent the location on the scene that received a touch input, which is also the location where we want to create a new category-filtered body. There's more... The default category of all fixtures has a value of one. When creating mask bits for specific fixtures, remember that any combination that includes the default category will cause the fixture to collide with all other fixtures that are not masked to avoid collision with the fixture. See also Introduction to the Box2D physics extension in this article. Understanding different body types in this article.
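As a closing illustration for this recipe, the following short, self-contained Java sketch reproduces the test that Box2D applies internally to category and mask bits when deciding whether two fixtures may collide, using the same bit values defined above. The FilterDemo class and the shouldCollide helper are our own illustrative names, not part of the AndEngine or Box2D API, and the wall fixture is assumed to use Box2D's defaults (category 1, mask 0xFFFF) since the intro recipe did not set filter bits for it.

public class FilterDemo {
    public static final short CATEGORYBIT_DEFAULT = 1;
    public static final short CATEGORYBIT_RED_BOX = 2;
    public static final short CATEGORYBIT_GREEN_BOX = 4;
    public static final short MASKBITS_RED_BOX = CATEGORYBIT_DEFAULT + CATEGORYBIT_RED_BOX;     // 3
    public static final short MASKBITS_GREEN_BOX = CATEGORYBIT_DEFAULT + CATEGORYBIT_GREEN_BOX; // 5

    // Two fixtures collide only when each one's mask accepts the other's category
    // (group indices, which the recipe leaves at 0, are ignored here).
    static boolean shouldCollide(short categoryA, short maskA, short categoryB, short maskB) {
        return (categoryA & maskB) != 0 && (categoryB & maskA) != 0;
    }

    public static void main(String[] args) {
        // red vs. red -> true
        System.out.println(shouldCollide(CATEGORYBIT_RED_BOX, MASKBITS_RED_BOX, CATEGORYBIT_RED_BOX, MASKBITS_RED_BOX));
        // red vs. green -> false (neither mask includes the other's category)
        System.out.println(shouldCollide(CATEGORYBIT_RED_BOX, MASKBITS_RED_BOX, CATEGORYBIT_GREEN_BOX, MASKBITS_GREEN_BOX));
        // red vs. default wall -> true
        System.out.println(shouldCollide(CATEGORYBIT_RED_BOX, MASKBITS_RED_BOX, CATEGORYBIT_DEFAULT, (short) 0xFFFF));
    }
}

Running this prints true, false, true, which matches the behavior seen on screen: boxes of the same color stack on each other and on the walls, while red and green boxes pass through one another.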

article-image-new-connectivity-apis-android-beam
Packt
18 Jan 2013
7 min read
Save for later

New Connectivity APIs – Android Beam

Packt
18 Jan 2013
7 min read
(For more resources related to this topic, see here.) Android Beam Devices that have NFC hardware can share data by tapping them together. This could be done with the help of the Android Beam feature. It is similar to Bluetooth, as we get seamless discovery and pairing as in a Bluetooth connection. Devices connect when they are close to each other (not more than a few centimeters). Users can share pictures, videos, contacts, and so on, using the Android Beam feature. Beaming NdefMessages In this section, we are going to implement a simple Android Beam application. This application will send an image to another device when two devices are tapped together. There are three methods that are introduced with Android Ice Cream Sandwich that are used in sending NdefMessages. These methods are as follows: setNdefPushMessage() : This method takes an NdefMessage as a parameter and sends it to another device automatically when devices are tapped together. This is commonly used when the message is static and doesn't change. setNdefPushMessageCallback() : This method is used for creating dynamic NdefMessages. When two devices are tapped together, the createNdefMessage() method is called. setOnNdefPushCompleteCallback() : This method sets a callback which is called when the Android Beam is successful. We are going to use the second method in our sample application. Our sample application's user interface will contain a TextView component for displaying text messages and an ImageView component for displaying the received images sent from another device. The layout XML code will be as follows: <RelativeLayout android_layout_width="match_parent" android_layout_height="match_parent" > <TextView android_id="@+id/textView" android_layout_width="wrap_content" android_layout_height="wrap_content" android_layout_centerHorizontal="true" android_layout_centerVertical="true" android_text="" /> <ImageView android_id="@+id/imageView" android_layout_width="wrap_content" android_layout_height="wrap_content" android_layout_below="@+id/textView" android_layout_centerHorizontal="true" android_layout_marginTop="14dp" /> </RelativeLayout> Now, we are going to implement, step-by-step, the Activity class of the sample application. The code of the Activity class with the onCreate() method is as follows: public class Chapter9Activity extends Activity implementsCreateNdefMessageCallback{NfcAdapter mNfcAdapter;TextView mInfoText;ImageView imageView;@Overridepublic void onCreate(Bundle savedInstanceState) {super.onCreate(savedInstanceState);setContentView(R.layout.main);imageView = (ImageView) findViewById(R.id.imageView);mInfoText = (TextView) findViewById(R.id.textView);// Check for available NFC AdaptermNfcAdapter =NfcAdapter.getDefaultAdapter(getApplicationContext());if (mNfcAdapter == null){mInfoText = (TextView) findViewById(R.id.textView);mInfoText.setText("NFC is not available on this device.");finish();return;}// Register callback to set NDEF messagemNfcAdapter.setNdefPushMessageCallback(this, this);}@Overridepublic boolean onCreateOptionsMenu(Menu menu) {getMenuInflater().inflate(R.menu.main, menu);return true;}} As you can see in this code, we can check whether the device provides an NfcAdapter. If it does, we get an instance of NfcAdapter. Then, we call the setNdefPushMessageCallback() method to set the callback using the NfcAdapter instance. 
We send the Activity class as a callback parameter because the Activity class implements CreateNdefMessageCallback.In order to implement CreateNdefMessageCallback, we should override the createNdefMessage()method as shown in the following code block: @Overridepublic NdefMessage createNdefMessage(NfcEvent arg0) {Bitmap icon =BitmapFactory.decodeResource(this.getResources(),R.drawable.ic_launcher);ByteArrayOutputStream stream = new ByteArrayOutputStream();icon.compress(Bitmap.CompressFormat.PNG, 100, stream);byte[] byteArray = stream.toByteArray();NdefMessage msg = new NdefMessage(new NdefRecord[] {createMimeRecord("application/com.chapter9", byteArray), NdefRecord.createApplicationRecord("com.chapter9")});return msg;}public NdefRecord createMimeRecord(String mimeType, byte[]payload) {byte[] mimeBytes = mimeType.getBytes(Charset.forName("USASCII"));NdefRecord mimeRecord = newNdefRecord(NdefRecord.TNF_MIME_MEDIA,mimeBytes, new byte[0], payload);return mimeRecord;} As you can see in this code, we get a drawable, convert it to bitmap, and then to a byte array. Then we create an NdefMessage with two NdefRecords. The first record contains the mime type and the byte array. The first record is created by the createMimeRecord() method. The second record contains the Android Application Record ( AAR). The Android Application Record was introduced with Android Ice Cream Sandwich. This record contains the package name of the application and increases the certainty that your application will start when an NFC Tag is scanned. That is, the system firstly tries to match the intent filter and AAR together to start the activity. If they don't match, the activity that matches the AAR is started. When the activity is started by an Android Beam event, we need to handle the message that is sent by the Android Beam. We handle this message in the onResume() method of the Activity class as shown in the following code block: @Overridepublic void onResume() {super.onResume();// Check to see that the Activity started due to an AndroidBeamif (NfcAdapter.ACTION_NDEF_DISCOVERED.equals(getIntent().getAction())) {processIntent(getIntent());}}@Overridepublic void onNewIntent(Intent intent) {// onResume gets called after this to handle the intentsetIntent(intent);}void processIntent(Intent intent) {Parcelable[] rawMsgs = intent.getParcelableArrayExtra(NfcAdapter.EXTRA_NDEF_MESSAGES);// only one message sent during the beamNdefMessage msg = (NdefMessage) rawMsgs[0];// record 0 contains the MIME type, record 1 is the AARbyte[] bytes = msg.getRecords()[0].getPayload();Bitmap bmp = BitmapFactory.decodeByteArray(bytes, 0,bytes.length);imageView.setImageBitmap(bmp);} As you can see in this code, we firstly check whether the intent is ACTION_NDEF_DISCOVERED. This means the Activity class is started due to an Android Beam. If it is started due to an Android Beam, we process the intent with the processIntent() method. We firstly get NdefMessage from the intent. Then we get the first record and convert the byte array in the first record to bitmap using BitmapFactory . Remember that the second record is AAR, we do nothing with it. Finally, we set the bitmap of the ImageView component. 
The AndroidManifest.xml file of the application should be as follows: <manifest package="com.chapter9"android:versionCode="1"android:versionName="1.0" ><uses-permission android_name="android.permission.NFC"/><uses-feature android_name="android.hardware.nfc"android:required="false" /><uses-sdkandroid:minSdkVersion="14"android:targetSdkVersion="15" /><applicationandroid:icon="@drawable/ic_launcher"android:label="@string/app_name"android:theme="@style/AppTheme" ><activityandroid:name=".Chapter9Activity"android:label="@string/title_activity_chapter9" ><intent-filter><action android_name="android.intent.action.MAIN" /><categoryandroid:name="android.intent.category.LAUNCHER" /></intent-filter><intent-filter><actionandroid:name="android.nfc.action.NDEF_DISCOVERED" /><categoryandroid:name="android.intent.category.DEFAULT" /><data android_mimeType="application/com.chapter9" /></intent-filter></activity></application></manifest> As you can see in this code, we need to set the minimum SDK to API Level 14 or more in the AndroidManifest.xml file because these APIs are available in API Level 14 or more. Furthermore, we need to set the permissions to use NFC. We also set the uses feature in AndroidManifest.xml. The feature is set as not required. This means that our application would be available for devices that don't have NFC support. Finally, we create an intent filter for android.nfc.action.NDEF_DISCOVERED with mimeType of application/com.chapter9. When a device sends an image using our sample application, the screen will be as follows: Summary In this article, we firstly learned the Android Beam feature of Android. With this feature, devices can send data using the NFC hardware. We implemented a sample Android Beam application and learned how to use Android Beam APIs. Resources for Article : Further resources on this subject: Android 3.0 Application Development: Multimedia Management [Article] Animating Properties and Tweening Pages in Android 3-0 [Article] Android User Interface Development: Animating Widgets and Layouts [Article]

article-image-creating-compiling-and-deploying-native-projects-android-ndk
Packt
13 Feb 2012
13 min read
Save for later

Creating, Compiling, and Deploying Native Projects from the Android NDK

Packt
13 Feb 2012
13 min read
(For more resources on Android, see here.) Compiling and deploying NDK sample applications I guess you cannot wait anymore to test your new development environment. So why not compile and deploy elementary samples provided by the Android NDK first to see it in action? To get started, I propose to run HelloJni, a sample application which retrieves a character string defined inside a native C library into a Java activity (an activity in Android being more or less equivalent to an application screen). Time for action – compiling and deploying hellojni sample Let's compile and deploy HelloJni project from command line using Ant: Open a command-line prompt (or Cygwin prompt on Windows). Go to hello-jni sample directory inside the Android NDK. All the following steps have to performed from this directory: $ cd $ANDROID_NDK/samples/hello-jni Create Ant build file and all related configuration files automatically using android command (android.bat on Windows). These files describe how to compile and package an Android application: android update project –p . (Move the mouse over the image to enlarge.) Build libhello-jni native library with ndk-build, which is a wrapper Bash script around Make. Command ndk-build sets up the compilation toolchain for native C/ C++ code and calls automatically GCC version featured with the NDK. $ ndk-build Make sure your Android development device or emulator is connected and running. Compile, package, and install the final HelloJni APK (an Android application package). All these steps can be performed in one command, thanks to Ant build automation tool. Among other things, Ant runs javac to compile Java code, AAPT to package the application with its resources, and finally ADB to deploy it on the development device. Following is only a partial extract of the output: $ ant install The result should look like the following extract: Launch a shell session using adb (or adb.exe on Windows). ADB shell is similar to shells that can be found on the Linux systems: $ adb shell From this shell, launch HelloJni application on your device or emulator. To do so, use am, the Android Activity Manager. Command am allows to start Android activities, services or sending intents (that is, inter-activity messages) from command line. Command parameters come from the Android manifest: # am start -a android.intent.action.MAIN -n com.example.hellojni/com.example.hellojni.HelloJni Finally, look at your development device. HelloJni appears on the screen! What just happened? We have compiled, packaged, and deployed an official NDK sample application with Ant and SDK command-line tools. We will explore them more in later part. We have also compiled our first native C library (also called module) using the ndk-build command. This library simply returns a character string to the Java part of the application on request. Both sides of the application, the native and the Java one, communicate through Java Native Interface. JNI is a standard framework that allows Java code to explicitly call native C/C++ code with a dedicated API. Finally, we have launched HelloJni on our device from an Android shell (adb shell) with the am Activity Manager command. Command parameters passed in step 8 come from the Android manifest: com.example.hellojni is the package name and com.example.hellojni. HelloJni is the main Activity class name concatenated to the main package. <?xml version="1.0" encoding="utf-8"?><manifest package="com.example.hellojni" HIGHLIGHT android_versionCode="1" android_versionName="1.0">... 
<activity android_name=".HelloJni" HIGHLIGHT android_label="@string/app_name">... Automated build Because Android SDK, NDK, and their open source bricks are not bound to Eclipse or any specific IDE, creating an automated build chain or setting up a continuous integration server becomes possible. A simple bash script with Ant is enough to make it work! HelloJni sample is a little bit... let's say rustic! So what about trying something fancier? Android NDK provides a sample named San Angeles. San Angeles is a coding demo created in 2004 for the Assembly 2004 competition. It has been later ported to OpenGL ES and reused as a sample demonstration in several languages and systems, including Android. You can find more information by visiting one of the author's page: http://jet.ro/visuals/4k-intros/san-angeles-observation/. Have a go hero – compiling san angeles OpenGL demo To test this demo, you need to follow the same steps: Go to the San Angeles sample directory. Generate project files. Compile and install the final San Angeles application. Finally run it. As this application uses OpenGL ES 1, AVD emulation will work, but may be somewhat slow! You may encounter some errors while compiling the application with Ant: The reason is simple: in res/layout/ directory, main.xml file is defined. This file usually defines the main screen layout in Java application—displayed components and how they are organized. However, when Android 2.2 (API Level 8) was released, the layout_width and layout_height enumerations, which describe the way UI components should be sized, were modified: FILL_PARENT became MATCH_PARENT. But San Angeles uses API Level 4. There are basically two ways to overcome this problem. The first one is selecting the right Android version as the target. To do so, specify the target when creating Ant project files: $ android update project –p . -–target android-8 This way, build target is set to API Level 8 and MATCH_PARENT is recognized. You can also change the build target manually by editing default.properties at the project root and replacing: target=android-4 with the following line: target=android-8 The second way is more straightforward: erase the main.xml file! Indeed, this file is in fact not used by San Angeles demo, as only an OpenGL screen created programmatically is displayed, without any UI components. Target right! When compiling an Android application, always check carefully if you are using the right target platform, as some features are added or updated between Android versions. A target can also dramatically change your audience wideness because of the multiple versions of Android in the wild... Indeed, targets are moving a lot and fast on Android!. All these efforts are not in vain: it is just a pleasure to see this old-school 3D environment full of flat-shaded polygons running for the first time. So just stop reading and run it! Exploring android SDK tools Android SDK includes tools which are quite useful for developers and integrators. We have already overlooked some of them including the Android Debug Bridge and android command. Let's explore them deeper.   Android debug bridge You may have not noticed it specifically since the beginning but it has always been there, over your shoulder. The Android Debug Bridge is a multifaceted tool used as an intermediary between development environment and emulators/devices. More specifically, ADB is: A background process running on emulators and devices to receive orders or requests from an external computer. 
A background server on your development computer communicating with connected devices and emulators. When listing devices, ADB server is involved. When debugging, ADB server is involved. When any communication with a device happens, ADB server is involved! A client running on your development computer and communicating with devices through ADB server. That is what we have done to launch HelloJni: we got connected to our device using adb shell before issuing the required commands. ADB shell is a real Linux shell embedded in ADB client. Although not all standard commands are available, classical commands, such as ls, cd, pwd, cat, chmod, ps, and so on are executable. A few specific commands are also provided such as: logcat To display device log messages dumpsys To dump system state dmesg To dump kernel messages ADB shell is a real Swiss Army knife. It also allows manipulating your device in a flexible way, especially with root access. For example, it becomes possible to observe applications deployed in their "sandbox" (see directory /data/data) or to a list and kill currently running processes. ADB also offers other interesting options; some of them are as follows: pull <device path> <local path> To transfer a file to your computer push <local path> <device path> To transfer a file to your device or emulator install <application package> To install an application package install -r <package to reinstall> To reinstall an application, if already deployed devices To list all Android devices currently connected, including emulators reboot To restart an Android device programmatically wait-for-device To sleep, until a device or emulator is connected to your computer (for example,. in a script) start-server To launch the ADB server communicating with devices and emulators kill-server To terminate the ADB server bugreport To print the whole device state (like dumpsys) help To get an exhaustive help with all options and flags available To ease the writing of issued command, ADB provides facultative flags to specify before options: -s <device id> To target a specific device -d To target current physical device, if only one is connected (or an error message is raised) -e To target currently running emulator, if only one is connected (or an error message is raised) ADB client and its shell can be used for advanced manipulation on the system, but most of the time, it will not be necessary. ADB itself is generally used transparently. In addition, without root access to your phone, possible actions are limited. For more information, see http://developer.android.com/guide/developing/tools/adb.html. Root or not root. If you know the Android ecosystem a bit, you may have heard about rooted phones and non-rooted phones. Rooting a phone means getting root access to it, either "officially" while using development phones or using hacks with an end user phone. The main interest is to upgrade your system before the manufacturer provides updates (if any!) or to use a custom version (optimized or modified, for example, CyanogenMod). You can also do any possible (especially dangerous) manipulations that an Administrator can do (for example, deploying a custom kernel). Rooting is not an illegal operation, as you are modifying YOUR device. But not all manufacturers appreciate this practice and usually void the warranty. 
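As a quick, hypothetical example of the ADB options listed above (the file, directory, and package names are placeholders), a typical session from the development computer might look like this:

# List connected devices and emulators
$ adb devices
# Copy a local file to the device's SD card and read it back
$ adb push clip.ogg /sdcard/media/clip.ogg
$ adb pull /sdcard/media/clip.ogg clip_copy.ogg
# Reinstall an application package on a specific emulator
$ adb -s emulator-5554 install -r bin/HelloJni-debug.apk
# Stream device log messages to the development computer
$ adb logcat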
Have a go hero – transferring a file to SD card from command line Using the information provided, you should be able to connect to your phone like in the good old days of computers (I mean a few years ago!) and execute some basic manipulation using a shell prompt. I propose you to transfer a resource file by hand, like a music clip or a resource that you will be reading from a future program of yours. To do so, you need to open a command-line prompt and perform the following steps: Check if your device is available using adb from command line. Connect to your device using the Android Debug Bridge shell prompt. Check the content of your SD card using standard Unix ls command. Please note that ls on Android has a specific behavior as it differentiates ls mydir from ls mydir/, when mydir is a symbolic link. Create a new directory on your SD card using the classic command mkdir . Finally, transfer your file by issuing the appropriate adb command. Project configuration tool The command named android is the main entry point when manipulating not only projects but also AVDs and SDK updates. There are few options available, which are as follows: create project: This option is used to create a new Android project through command line. A few additional options must be specified to allow proper generation: -p The project path -n The project name -t The Android API target -k The Java package, which contains application's main class -a The application's main class name (Activity in Android terms) For example: $ android create project –p ./MyProjectDir –n MyProject –t android-8 –k com.mypackage –a MyActivity update project: This is what we use to create Ant project files from an existing source. It can also be used to upgrade an existing project to a new version. Main parameters are as follows: -p The project path -n To change the project name -l To include an Android library project (that is, reusable code). The path must be relative to the project directory). -t To change the Android API target There are also options to create library projects (create lib-project, update lib- project) and test projects (create test-project, update test-project). I will not go into details here as this is more related to the Java world. As for ADB, android command is your friend and can give you some help: $ android create project –help   Command android is a crucial tool to implement a continuous integration toolchain in order to compile, package, deploy, and test a project automatically entirely from command line. Have a go hero – towards continuous integration With adb, android, and ant commands, you have enough knowledge to build a minimal automatic compilation and deployment script to perform some continuous integration. I assume here that you have a versioning software available and you know how to use it. Subversion (also known as SVN) is a good candidate and can work in local (without a server). Perform the following operations: Create a new project by hand using android command. Then, create a Unix or Cygwin shell script and assign it the necessary execution rights (chmod command). All the following steps have to be scribbled in it. In the script, check out sources from your versioning system (for example, using a svn checkout command) on disk. If you do not have a versioning system, you can still copy your own project directory using Unix commands. Build the application using ant. Do not forget to check command results using $?. If the returned value is different from 0, it means an error occurred. 
Additionally, you can use grep or some custom tools to check potential error messages.
If needed, deploy resource files using adb.
Install the application on your device or on the emulator (which you can launch from the script) using ant, as shown previously.
You can even try to launch your application automatically and check the Android logs (see the logcat option of adb). Of course, your application needs to make use of logs! A minimal script putting these steps together is sketched at the end of this section.
A free monkey to test your App!
To automate UI testing of an Android application, an interesting utility provided with the Android SDK is MonkeyRunner, which can simulate user actions on a device to perform automated UI testing. Have a look at http://developer.android.com/guide/developing/tools/monkeyrunner_concepts.html.
To favor automation, a single Android shell statement can be executed from the command line as follows:
adb shell ls /sdcard/
To execute a command on an Android device and retrieve its result back in your host shell, execute the following command:
adb shell "ls /notexistingdir/ 1> /dev/null 2>&1; echo \$?"
Redirection is necessary to avoid polluting the standard output. The escape character before $? is required to avoid early interpretation by the host shell.
Now you are fully prepared to automate your own build toolchain!
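Putting the previous pieces together, a minimal continuous integration script might look like the following sketch. The repository URL, the log marker that is grepped for, and the package and activity names (modeled on the earlier create project example) are placeholders to adapt to your own setup:

#!/bin/sh
# Minimal sketch: check out, build, deploy, launch, and scan the logs.
svn checkout http://myserver/svn/MyProject/trunk MyProject || exit 1
cd MyProject || exit 1

ant clean debug
if [ $? -ne 0 ]; then
    echo "Build failed"
    exit 1
fi

# Wait for a device or emulator, then install the debug package built by Ant.
adb wait-for-device
adb install -r bin/MyProject-debug.apk
if [ $? -ne 0 ]; then
    echo "Deployment failed"
    exit 1
fi

# Launch the main activity and dump the logs, looking for our own error tag.
adb shell am start -n com.mypackage/.MyActivity
sleep 10
adb logcat -d | grep "MyActivity-ERROR" && exit 1
exit 0

Such a script can then be triggered by a scheduler or a continuous integration server after each commit.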

Android 3.0 Application Development: Multimedia Management
Packt
08 Aug 2011
6 min read
Android 3.0 Application Development Cookbook Over 70 working recipes covering every aspect of Android development Very few successful applications are completely silent or have only static graphics, and in order that Android developers take full advantage of the advanced multimedia capabilities of today's smartphones, the system provides the android.media package, which contains many useful classes. The MediaPlayer class allows the playback of both audio and video from raw resources, files, and network streams, and the MediaRecorder class makes it possible to record both sound and images. Android also offers ways to manipulate sounds and create interactive effects through the use of the SoundPool class, which allows us to not only bend the pitch of our sounds but also to play more than one at a time.   Playing an audio file from within an application One of the first things that we may want to do with regards to multimedia is play back an audio file. Android provides the android.media.MediaPlayer class for us and this makes playback and most media related functions remarkably simple. In this recipe we will create a simple media player that will play a single audio file. Getting ready Before we start this project we will need an audio file for playback. Android can decode audio with any of the following file extensions: .3GP .MP4 .M4A .MP3 .OGG .WAV There are also quite a few MIDI file formats that are acceptable but have not been included here as their use is less common and their availability often depends on whether a device is running the standard Android platform or a specific vendor extension. Before you start this exercise create or find a short sound sample in one of the given formats. We used a five second Ogg Vorbis file and called it my_sound_file.ogg. How to do it... Start up a new Android project in Eclipse and create a new folder: res/raw. Place the sound file that you just prepared in this folder. In this example we refer to it as my_sound_file. Using either the Graphical Layout or the main.xml panel edit the file res/layout/main.xml to contain three buttons, as seen in the following screenshot: Call these buttons play_button, pause_button and stop_button. In the Java activity code declare a MediaPlayer in the onCreate() method: @Override public void onCreate(Bundle state) { super.onCreate(state); setContentView(R.layout.main); final MediaPlayer mPlayer; Associate the buttons we added in step 3 with Java variables by adding the following lines to onCreate(): Button playButton = (Button) findViewById(R.id.play_button); Button pauseButton = (Button) findViewById(R.id.pause_button); Button stopButton = (Button) findViewById(R.id.stop_button); We need a click listener for our play button. This also can be defined from within onCreate(): playButton.setOnClickListener(new OnClickListener() { public void onClick(View v) { mPlayer = MediaPlayer.create(this, R.raw.my_sound_file); mPlayer.setLooping(true); mPlayer.start(); } }); Next add a listener for the pause button as follows: pauseButton.setOnClickListener(new OnClickListener() { public void onClick(View v) { mPlayer.pause(); } }); Finally, include a listener for the stop button: stopButton.setOnClickListener(new OnClickListener() { public void onClick(View v) { mPlayer.stop(); mPlayer.reset(); } }); Now run this code on an emulator or your handset and test each of the buttons. How it works... The MediaPlayer class provides some useful functions and the use of start(), pause(), stop(), and setLooping() should be clear. 
However, if you are thinking that calling MediaPlayer.create(context, ID) every time the start button is pressed is overkill, you would be correct. This is because once stop() has been called on the MediaPlayer, the media needs to be reset and prepared (with reset() and prepare()) before start() can be called again. Fortunately MediaPlayer.create() also calls prepare() so that the first time we play an audio file we do not have to worry about this. The lifecycle of the MediaPlayer is not always straightforward and the order in which it takes on various states is best explained diagrammatically: Otherwise, MediaPlayer has lots of useful methods such as isPlaying(), which will return a Boolean telling us whether our file is being played or not, or getDuration() and getCurrentPosition(), which inform us of how long the sample is and how far through it we are. There are also some useful hooks that we can employ using MediaPlayer and the most commonly used are onCompletionListener() and onErrorListener(). There's more... We are not restricted to playing back raw resources. We can also playback local files or even stream audio. Playing back a file or a stream Use the MediaPlayer.setDataSource(String) method to play an audio file or stream. In the case of streaming audio this will need to be a URL representing a media file that is capable of being played progressively, and you will need to prepare the media player each time it runs: MediaPlayer player = new MediaPlayer(); player.setDataSource("string value of your file path or URL"); player.prepare(); player.start(); It is essential to surround setDataSource() with a try/catch clause in case the source does not exist when dealing with removable or online media.   Playing back video from external memory The MediaPlayer class that we met in the previous recipe works for video in the same manner that it does for audio and so as not to make this task a near copy of the last, here we will look at how to play back video files stored on an SD card using the VideoView object. Getting ready This recipe requires a video file for our application to playback. Android can decode H.263, H.264 and MPEG-4 files; generally speaking this means files with .3gp and .mp4 file extensions. For platforms since 3.0 (API level 11) it is also possible to manage H.264 AVC files. Find a short video clip in one of these compatible formats and save it on the SD card of your handset. Alternatively you can create an emulator with an SD card enabled and push your video file onto it. This can be done easily through Eclipse's DDMS perspective from the File Explorer tab: In this example we called our video file my_video.3gp.  
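The excerpt stops before the video recipe's remaining steps, so, purely as an illustration of where it is heading, the following sketch plays the SD card clip through a VideoView. The video_view id and the hard-coded path are my own placeholders rather than the book's code:

import android.app.Activity;
import android.net.Uri;
import android.os.Bundle;
import android.os.Environment;
import android.widget.MediaController;
import android.widget.VideoView;

public class VideoPlayerActivity extends Activity {

    @Override
    public void onCreate(Bundle state) {
        super.onCreate(state);
        setContentView(R.layout.main);

        // Assumes main.xml contains a <VideoView android:id="@+id/video_view"> element.
        VideoView videoView = (VideoView) findViewById(R.id.video_view);

        // Path to the clip saved on the SD card in the Getting ready step.
        String path = Environment.getExternalStorageDirectory() + "/my_video.3gp";
        videoView.setVideoURI(Uri.parse(path));

        // Optional on-screen play/pause/seek controls.
        videoView.setMediaController(new MediaController(this));
        videoView.start();
    }
}

Unlike the raw-resource MediaPlayer example, the clip here lives on external storage, so the path may vary between devices and should ideally be checked before playback.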

Android 3.0 Application Development: Managing Menus
Packt
26 Jul 2011
7 min read
Android 3.0 Application Development Cookbook All Android handsets have a hard menu key for calling up secondary choices that do not need to be made available from the main screen or perhaps need to be made available across an application. In concord with Android's philosophy of separating appearance from function, menus are generally created in the same way as other visual elements, that is, with the use of a definitive XML layout file. There is a lot that can be done to control menus dynamically and Android provides classes and interfaces for displaying context-sensitive menus, organizing menu items into groups, and including shortcuts. Creating and inflating an options menu To keep our application code separate from our menu layout information, Android uses a designated resource folder (res/menu) and an XML layout file to define the physical appearance of our menu; such as the titles and icons we see in Android pop-up menus. The Activity class contains a callback method, onCreateOptionsMenu(), that can be overridden to inflate a menu. Getting ready Android menus are defined in a specific, designated folder. Eclipse does not create this folder by default so start up a new project and add a new folder inside the res folder and call it menu. How to do it... Create a new XML file in our new res/menu folder and call it my_menu.xml. Complete the new file as follows: <?xml version="1.0" encoding="utf-8"?> <menu > <item android_id="@+id/item_one" android_title="first item" /> <item android_id="@+id/item_two" android_title="second item" /> </menu> In the Java application file, include the following overridden callback: @Override public boolean onCreateOptionsMenu(Menu menu) { MenuInflater inflater = getMenuInflater(); inflater.inflate(R.menu.my_menu, menu); return true; } Run the application on a handset or emulator and press the hard menu key to view the menu: How it works... Whenever we create an Android menu using XML we must place it in the folder we used here (res/menu). Likewise, the base node of our XML structure must be <menu>. The purpose of the id element should be self explanatory and the title attribute is used to set the text that the user sees when the menu item is inflated. The MenuInflater object is a straightforward way of turning an XML layout file into a Java object. We create a MenuInflater with getMenuInflater() which returns a MenuInflater from the current activity, of which it is a member. The inflate() call takes both the XML file and the equivalent Java object as its parameters. There's more... The type of menu we created here is referred to as an options menu and it comes in two flavors depending on how many items it contains. There is also a neater way to handle item titles when they are too long to be completely displayed. Handling longer options menus When an options menu has six or fewer items it appears as a block of items at the bottom of the screen. This is called the icon menu and is, as its name suggests, the only menu type capable of displaying icons. On tablets running API level 11 or greater the Action bar can also be used to access the menu. The icon menu is also the only menu type that cannot display radio buttons or check marks. When an inflated options menu has more than six items, the sixth place on the icon menu is replaced by the system's own More item, which when pressed calls up the extended menu which displays all items from the sixth onwards, adding a scroll bar if necessary. 
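Inflating the menu only puts it on screen; responding to a tap on one of its items happens in a separate callback, which the excerpt does not reach. A minimal, hedged sketch for the two items defined in my_menu.xml might look like this (the Toast messages are my own illustration):

@Override
public boolean onOptionsItemSelected(MenuItem item) {
    // Requires android.view.MenuItem and android.widget.Toast imports.
    switch (item.getItemId()) {
        case R.id.item_one:
            Toast.makeText(this, "first item", Toast.LENGTH_SHORT).show();
            return true;
        case R.id.item_two:
            Toast.makeText(this, "second item", Toast.LENGTH_SHORT).show();
            return true;
        default:
            return super.onOptionsItemSelected(item);
    }
}

Returning true tells Android that the selection has been handled; anything not handled is passed up to the superclass.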
Providing condensed menu titles If Android cannot fit an item's title text into the space provided (often as little as one third of the screen width) it will simply truncate it. To provide a more readable alternative, include the android:titleCondensed="string" attribute alongside android:title in the item definition. Adding Option menu items to the Action Bar For tablet devices targeting Android 3.0 or greater, option menu items can be added to the Action Bar. Adjust the target build of the above project to API level 11 or above and replace the res/menu/my_menu.xml file with the following: <?xml version="1.0" encoding="utf-8"?> <menu > <item android_id="@+id/item_one" android_title="first item" android_icon="@drawable/icon" android_showAsAction="ifRoom" /> <item android_id="@+id/item_two" android_title="second item" android_icon="@drawable/icon" android_showAsAction="ifRoom|withText" /> <item android_id="@+id/item_three" android_title="third item" android_icon="@drawable/icon" android_showAsAction="always" /> <item android_id="@+id/item_four" android_title="fourth item" android_icon="@drawable/icon" android_showAsAction="never" /> </menu> Note from the output that unless the withText flag is included, the menu item will display only as an icon: Designing Android compliant menu icons The menu items we defined in the previous recipe had only text titles to identify them to the user, however nearly all Icon Menus that we see on Android devices combine a text title with an icon. Although it is perfectly possible to use any graphic image as a menu icon, using images that do not conform to Android's own guidelines on icon design is strongly discouraged, and Android's own development team are particularly insistent that only the subscribed color palette and effects are used. This is so that these built-in menus which are universal across Android applications provide a continuous experience for the user. Here we examine the colors and dimensions prescribed and also examine how to provide the subsequent images as system resources in such a way as to cater for a variety of screen densities. Getting ready The little application we put together in the last recipe makes a good starting point for this one. Most of the information here is to do with design of the icons, so you may want to have a graphics editor such as GIMP or PhotoShop open, or you may want to refer back here later for the exact dimensions and palettes. How to do it... Open the res/menu/my_menu.xml file and add the android:icon elements seen here to each item: <?xml version="1.0" encoding="utf-8"?> <menu > <item android_id="@+id/item_one" android_icon="@drawable/my_menu_icon" android_title="first item" /> <item android_id="@+id/item_two" android_icon="@drawable/my_menu_icon" android_title="second item" /> </menu> With your graphics editor, create a new transparent PNG file, precisely 48 by 48 pixels in dimension. Ensuring that there is at least a 6 pixel border all the way around, produce your icon as a simple two-dimensional flat shape. Something like this: Fill the shape with a grayscale gradient that ranges from 47% to 64% (white) with the lighter end at the top. 
Provide a black inner shadow with the following settings: 20% opaque 90° angle (top to bottom) 2 pixel width 2 pixel distance Next, add an inner bevel with: Depth of 1% 90° altitude 70% opaque, white highlight 25% opaque, black shadow Now give the graphic a white outer glow with: 55% opacity 3 pixel size 10% spread Make two copies of our graphic, one resized to 36 by 36 pixels and one 72 by 72 pixels. Save the largest file in the res/drawable-hdpi as my_menu_icon.png. Save the 48 by 48 pixel file with the same name in the drawable-mdpi folder and the smallest image in drawable-ldpi. To see the full effect of these three files in action you will need to run the software on handsets with different screen resolutions or construct emulators to that purpose. How it works... As already mentioned, Android currently insists that menu icons conform to their guidelines and most of the terms used here should be familiar to anyone who has designed an icon before. The designated drawable folders allow us to provide the best possible graphics for a wide variety of screen densities. Android will automatically select the most appropriate graphic for a handset or tablet so that we can refer to our icons generically with @drawable/. It is only ever necessary to provide icons for the first five menu items as the Icon Menu is the only type to allow icons.  

Android 3.0 Application Development: GPS, Locations, and Maps
Packt
22 Jul 2011
7 min read
Android 3.0 Application Development Cookbook: Design and develop rich smartphone and tablet applications for Android 3.0
Introduction
For managing location-based information, Android provides the android.location package, which in turn gives us the LocationManager class that gives us access to location-based functions such as the latitude and longitude of a device's position. Tracking a device over time is made equally convenient and the LocationListener class monitors changes in location as they occur. Listening for location changes is only a part of the story, as Google provides APIs for managing Google Maps data and displaying and manipulating maps through the use of the MapView and MapController classes. These powerful tools require us to sign up with Google first, and once that is done they enable us to zoom in and out of maps, pan to any location that we are looking for, include application information on a map when we want to, and even add our own layers to maps and mark locations on a Google map.
Detecting a device's location
Android locations are expressed in terms of latitude and longitude coordinates. The default format is degrees. The Location object can also be used to store a time-stamp and other information such as speed and distance traveled. Although obtaining a device's last known location does not always yield the most accurate information, it is often the first reading that we may want. It is fast, simple to employ, and makes a good introduction to the LocationManager. The following permission needs to be included in the project manifest file:
<uses-permission android:name="android.permission.ACCESS_FINE_LOCATION" />
How to do it...
Use the TextView provided in the main.xml file and give it a resource ID:
android:id="@+id/text_view"
Declare a TextView as a class-wide field in the Java activity code:
TextView textView;
Then, find it in the usual way, from within the onCreate() method:
textView = (TextView) findViewById(R.id.text_view);
Next, and still within onCreate(), declare and define our LocationManager:
LocationManager manager = (LocationManager) getSystemService(Context.LOCATION_SERVICE);
Then, to retrieve the last known location using GPS and display this in the text view, add these lines:
Location loc = manager.getLastKnownLocation(LocationManager.GPS_PROVIDER);
textView.setText("latitude: " + loc.getLatitude() + "\nlongitude: " + loc.getLongitude());
Run the code on a handset or emulator to obtain its location:
How it works...
The use of a LocationManager to obtain the device's last known location is very straightforward. As with other system services, we obtained it with getSystemService(), and the getLastKnownLocation() method returns the Location object itself, which can be further queried to provide latitude and longitude coordinates. We could have done more with the Location object; for example, Location.getAltitude() will return altitude, and distanceTo(Location) and bearingTo(Location) will return the distance and bearing to another Location. It is possible to send mock locations to an emulator using the DDMS perspective in Eclipse. Before sending location data this way, make sure that you have set the emulator to allow mock locations under Settings | Applications | Development. It is worth noting that although the getLastKnownLocation() method may not always be accurate, particularly if the device has been switched off for some time, it does have the advantage of yielding almost immediate results.
There's more...
Using GPS to obtain a location has a couple of drawbacks.
Firstly, it does not work indoors; and secondly, it is very demanding on the battery. Location can also be determined by comparing cell tower signal strengths, and although this method is not as accurate, it works well indoors and is much more considerate of the device's battery.
Obtaining a location with a network provider
The network provider is set up in exactly the same way as the previous GPS example; simply exchange the Location declaration with:
Location loc = manager.getLastKnownLocation(LocationManager.NETWORK_PROVIDER);
You will also need to change, or amend, the permission in the manifest file with:
<uses-permission android:name="android.permission.ACCESS_COARSE_LOCATION" />
Listening for location changes
Obtaining the last known location as we did in the previous recipe is all well and good and handy for retrieving a Location quickly, but it can be unreliable if the handset has been switched off or if the user is on the move. Ideally we want to be able to detect location changes as they happen, and to do this we employ a LocationListener. In this recipe we will create a simple application that keeps track of a mobile device's movements.
Getting ready
This task can be performed most easily by starting where the previous one left off. If you have not completed that task yet, do so now (it is very short) and then return here. If you have already completed the recipe then simply open it up to proceed.
How to do it...
First, move the declarations of our LocationManager and our LocationListener so that they are class-wide fields (the listener needs to be visible to onResume() and onPause() later on):
LocationManager manager;
LocationListener listener;
In the main Java activity code, before the TextView.setText() call, add the following three lines:
listener = new MyLocationListener();
manager.requestLocationUpdates(LocationManager.GPS_PROVIDER, 30000, 50, listener);
Location location = manager.getLastKnownLocation(LocationManager.GPS_PROVIDER);
Now create an inner class called MyLocationListener that implements LocationListener:
public class MyLocationListener implements LocationListener { }
Eclipse will most likely insist that you add some unimplemented methods and you should do so. For now, only complete one of them, the onLocationChanged() callback:
@Override public void onLocationChanged(Location l) { textView.setText("\n\nlatitude: " + l.getLatitude() + "\nlongitude: " + l.getLongitude()); }
Leave the others as they are:
@Override public void onProviderDisabled(String provider) {}
@Override public void onProviderEnabled(String provider) {}
@Override public void onStatusChanged(String provider, int status, Bundle extras) {}
If you want to test this code on an emulator, then go right ahead. However, this code will create a serious drain on the battery of a handset, and it is wise to switch our listener off when it is not needed. Here we have used the activity's onPause() and onResume() functions to control this. You may wish to include these statements in any part of your activity's life cycle that suits your application's purpose:
@Override protected void onResume() { super.onResume(); manager.requestLocationUpdates(LocationManager.GPS_PROVIDER, 30000, 50, listener); }
@Override protected void onPause() { super.onPause(); manager.removeUpdates(listener); }
If you have not already tested this application, do so now. You will need to move around if you are testing it on a real device, or send mock locations to an emulator to see the code in action:
How it works...
In this recipe we used the LocationManager to provide location updates roughly every 30 seconds (30000 milliseconds) or whenever the location changed by more than 50 meters. We say 'roughly' because these values work only as a guide and the actual frequency of updates often varies from the values we set. Nevertheless, setting these two parameters of the requestLocationUpdates() method to high values can make a big difference to the amount of battery power the GPS provider consumes. Hopefully the use of the provider and the LocationListener as the other two parameters is self-explanatory. The LocationListener operates very much as other listeners do, and the purpose of onProviderEnabled() and onProviderDisabled() should be clear. The onStatusChanged() method is called whenever a provider becomes unavailable after a period of availability, or vice versa. The int status can represent 0 = OUT_OF_SERVICE, 1 = TEMPORARILY_UNAVAILABLE, or 2 = AVAILABLE.
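Because the recipe's pieces are spread across several steps, the following condensed sketch shows one way they fit together, with the activity itself implementing LocationListener (a variation on the recipe's inner class) so that the same reference can be passed to requestLocationUpdates() and removeUpdates(). Treat it as an illustrative outline rather than the book's exact listing:

import android.app.Activity;
import android.location.Location;
import android.location.LocationListener;
import android.location.LocationManager;
import android.os.Bundle;
import android.widget.TextView;

public class TrackerActivity extends Activity implements LocationListener {

    private LocationManager manager;
    private TextView textView;

    @Override
    public void onCreate(Bundle state) {
        super.onCreate(state);
        setContentView(R.layout.main);
        textView = (TextView) findViewById(R.id.text_view);
        manager = (LocationManager) getSystemService(LOCATION_SERVICE);
    }

    @Override
    protected void onResume() {
        super.onResume();
        // Roughly every 30 seconds or 50 meters, as in the recipe.
        manager.requestLocationUpdates(LocationManager.GPS_PROVIDER, 30000, 50, this);
    }

    @Override
    protected void onPause() {
        super.onPause();
        manager.removeUpdates(this);
    }

    @Override
    public void onLocationChanged(Location l) {
        textView.setText("latitude: " + l.getLatitude()
                + "\nlongitude: " + l.getLongitude());
    }

    @Override public void onProviderDisabled(String provider) {}
    @Override public void onProviderEnabled(String provider) {}
    @Override public void onStatusChanged(String provider, int status, Bundle extras) {}
}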

Manifest Assurance: Security and Android Permissions for Flash
Packt
29 Jun 2011
8 min read
Setting application permissions with the Android Manifest file When users choose to install an application on Android, they are always presented with a warning about which permissions the application will have within their particular system. From Internet access to full Geolocation, Camera, or External Storage permissions; the user is explicitly told what rights the application will have on their system. If it seems as though the application is asking for more permissions than necessary, the user will usually refuse the install and look for another application to perform the task they need. It is very important to only require the permissions your application truly needs, or else users might be suspicious of you and the applications you make available. How to do it... There are three ways in which we can modify the Android Manifest file to set application permissions for compiling our application with Adobe AIR. Using Flash Professional: Within an AIR for Android project, open the Properties panel and click the little wrench icon next to Player selection: The AIR for Android Settings dialog window will appear. You will be presented with a list of permissions to either enable or disable for your application. Check only the ones your application will need and click OK when finished. Using Flash Builder: When first setting up your AIR for Android project in Flash Builder, define everything required in the Project Location area, and click Next. You are now in the Mobile Settings area of the New Flex Mobile Project dialog. Click the Permissions tab, making sure that Google Android is the selected platform. You will be presented with a list of permissions to either enable or disable for your application. Check only the ones your application will need and continue along with your project setup: To modify any of these permissions after you've begun developing the application, simply open the AIR descriptor file and edit it as is detailed in the following sections. Using a simple text editor: Find the AIR Descriptor File in your project. It is normally named something like {MyProject}-app.xml as it resides at the project root. Browse the file for a node named <android> within this node will be another called <manifestAdditions> which holds a child node called <manifest>. This section of the document contains everything we need to set permissions for our Android application. All we need to do is either comment out or remove those particular permissions that our application does not require. For instance, this application needs Internet, External Storage, and Camera access. Every other permission node is commented out using the standard XML comment syntax of <!-- <comment here> -->: <uses-permission name="android.permission.INTERNET"/> <uses-permission name="android.permission.WRITE_EXTERNAL_ STORAGE"/> <!--<uses-permission name="android.permission.READ_PHONE_ STATE"/>--> <!--<uses-permission name="android.permission.ACCESS_FINE_ LOCATION"/>--> <!--<uses-permission name="android.permission.DISABLE_ KEYGUARD"/>--> <!--<uses-permission name="android.permission.WAKE_LOCK"/>-- > <uses-permission name="android.permission.CAMERA"/> <!--<uses-permission name="android.permission.RECORD_ AUDIO"/>--> <!--<uses-permission name="android.permission.ACCESS_ NETWORK_STATE"/>--> <!--<uses-permission name="android.permission.ACCESS_WIFI_ STATE"/>--> How it works... 
The permissions you define within the AIR descriptor file will be used to create an Android Manifest file to be packaged within the .apk produced by the tool used to compile the project. These permissions restrict and enable the application once it is installed on a user's device, and also alert the user, prior to installation, as to which activities and resources the application will be given access to. It is very important to provide only the permissions necessary for an application to perform its expected tasks once installed on a device. The following is a list of the possible permissions for the Android manifest document:
ACCESS_COARSE_LOCATION: Allows the Geolocation class to access WIFI and triangulated cell tower location data.
ACCESS_FINE_LOCATION: Allows the Geolocation class to make use of the device GPS sensor.
ACCESS_NETWORK_STATE: Allows an application to access the network state through the NetworkInfo class.
ACCESS_WIFI_STATE: Allows an application to access the WIFI state through the NetworkInfo class.
CAMERA: Allows an application to access the device camera.
INTERNET: Allows the application to access the Internet and perform data transfer requests.
READ_PHONE_STATE: Allows the application to mute audio when a phone call is in effect.
RECORD_AUDIO: Allows microphone access to the application to record or monitor audio data.
WAKE_LOCK: Allows the application to prevent the device from going to sleep using the SystemIdleMode class. (Must be used alongside DISABLE_KEYGUARD.)
DISABLE_KEYGUARD: Allows the application to prevent the device from going to sleep using the SystemIdleMode class. (Must be used alongside WAKE_LOCK.)
WRITE_EXTERNAL_STORAGE: Allows the application to write to external memory. This memory is normally a device's SD card.
Preventing the device screen from dimming
The Android operating system will dim, and eventually turn off, the device screen after a certain amount of time has passed. It does this to preserve battery life, as the display is the primary power drain on a device. For most applications, a user interacting with the interface will prevent the screen from dimming. However, if your application does not involve user interaction for lengthy periods of time, yet the user is looking at or reading something on the display, it makes sense to prevent the screen from dimming.
How to do it...
There are two settings in the AIR descriptor file that can be changed to ensure the screen does not dim. We will also modify properties of our application to complete this recipe:
Find the AIR descriptor file in your project. It is normally named something like {MyProject}-app.xml and resides at the project root.
Browse the file for a node named <android>; within this node will be another called <manifestAdditions>, which holds a child node called <manifest>. This section of the document contains everything we need to set permissions for our Android application.
All we need to do is make sure the following two nodes are present within this section of the descriptor file. Note that enabling both of these permissions is required to allow application control over the system through the SystemIdleMode class. Uncomment them if necessary.
<uses-permission android_name="android.permission.WAKE_LOCK" /> <uses-permission android_name="android.permission.DISABLE_ KEYGUARD" /> Within our application, we will import the following classes: import flash.desktop.NativeApplication; import flash.desktop.SystemIdleMode; import flash.display.Sprite; import flash.display.StageAlign; import flash.display.StageScaleMode; import flash.text.TextField; import flash.text.TextFormat; Declare a TextField and TextFormat pair to trace out messages to the user: private var traceField:TextField; private var traceFormat:TextFormat; Now, we will set the system idle mode for our application by assigning the SystemIdleMode.KEEP_AWAKE constant to the NativeApplication.nativeApplication.systemIdleMode property: protected function setIdleMode():void { NativeApplication.nativeApplication.systemIdleMode = SystemIdleMode.KEEP_AWAKE; } We will, at this point, continue to set up our TextField, apply a TextFormat, and add it to the DisplayList. Here, we create a method to perform all of these actions for us: protected function setupTraceField():void { traceFormat = new TextFormat(); traceFormat.bold = true; traceFormat.font = "_sans"; traceFormat.size = 24; traceFormat.align = "left"; traceFormat.color = 0xCCCCCC; traceField = new TextField(); traceField.defaultTextFormat = traceFormat; traceField.selectable = false; traceField.multiline = true; traceField.wordWrap = true; traceField.mouseEnabled = false; traceField.x = 20; traceField.y = 20 traceField.width = stage.stageWidth-40; traceField.height = stage.stageHeight - traceField.y; addChild(traceField); } Here, we simply output the currently assigned system idle mode String to our TextField, letting the user know that the device will not be going to sleep: protected function checkIdleMode():void { traceField.text = "System Idle Mode: " + NativeApplication. nativeApplication.systemIdleMode; } When the application is run on a device, the System Idle Mode will be set and the results traced out to our display. The user can leave the device unattended for as long as necessary and the screen will not dim or lock. In the following example, this application was allowed to run for five minutes without user intervention: How it works... There are two things that must be done in order to get this to work correctly and both are absolutely necessary. First, we have to be sure the application has correct permissions through the Android Manifest file. Allowing the application permissions for WAKE_LOCK and DISABLE_KEYGUARD within the AIR descriptor file will do this for us. The second part involves setting the NativeApplication.systemIdleMode property to keepAwake. This is best accomplished through use of the SystemIdleMode.KEEP_AWAKE constant. Ensuring that these conditions are met will enable the application to keep the device display lit and prevent Android from locking the device after it has been idle.
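As a small follow-on to the recipe (my own suggestion, not part of the original text), it can be battery-friendly to hold the screen awake only while the application is in the foreground, switching back to SystemIdleMode.NORMAL when AIR dispatches its deactivate event:

import flash.desktop.NativeApplication;
import flash.desktop.SystemIdleMode;
import flash.events.Event;

// Keep the screen awake only while the application is active.
protected function manageIdleMode():void {
    NativeApplication.nativeApplication.systemIdleMode = SystemIdleMode.KEEP_AWAKE;
    NativeApplication.nativeApplication.addEventListener(Event.DEACTIVATE, onDeactivate);
    NativeApplication.nativeApplication.addEventListener(Event.ACTIVATE, onActivate);
}

protected function onDeactivate(e:Event):void {
    // Let the system dim the screen again while we are in the background.
    NativeApplication.nativeApplication.systemIdleMode = SystemIdleMode.NORMAL;
}

protected function onActivate(e:Event):void {
    NativeApplication.nativeApplication.systemIdleMode = SystemIdleMode.KEEP_AWAKE;
}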

Top 5 Must-have Android Applications
Packt
28 Jun 2011
6 min read
  Android Application Testing Guide Build intensively tested and bug free Android applications     1. ES File Explorer Description: ES File Explorer has everything you would expect from a file explorer – you can copy, paste, rename and delete files. You can select multiple files at a time just as you would on your PC or MAC. You can also compress files to zip or gz. One of the best features of ES file explorer is the ability to connect to network shares – this means you can connect to a shared folder on your LAN and transfer files to and from your Android device. Its user interface is very simple, quick and easy to use. Screenshots:   Features: Multiselect and Operate files (Copy, Paste, Cut/Move, Create, Delete and Rename, Share/Send) in the phone and computers Application manager -- Manage apps(Install, Uninstall, Backup, Shortcuts, Category) View Different file formats, photos, docs, videos anywhere, support third party applications such as Document To Go to open document files Text viewers and editors Bluetooth file transfer tool Access your Home PC, via WIFI with SMB Compress and Decompress ZIP files, Unpack RAR files, Create encrypted (AES 256 bit) ZIP files Manage the files on the FTP server as the ones on the sd card Link: This application is available for download at: https://market.android.com/details?id=com.estrongs.android.pop 2. Go Sms Pro Description: GO SMS Pro is the ultimate messaging application for Android devices. There is a nice setting that launches a pop-up for incoming messages. Users can then respond or delete directly within the window. The app supports batch actions for deleting, marking all, or backing up. It is highly customizable; everything from the text color, to the color of the background, SMS ringtones for specific contacts, and themes for the SMS application can be customized. Another interesting feature that you can take advantage of is the multiple plug-ins that are available as free download in the market. The Facebook chat plug-in makes it possible to receive and send Facebook chat messages and since these messages are sent through the Facebook network it does not affect your SMS messages at all. Screenshots: Features: GO-MMS service (FREE), you may send picture/music to your friend(ever they are no GO SMS) through one SMS with 2G/3G/4G or WIFI Many cool themes; also support DIY theme, and Wallpaper Maker plug-in; Fully customizable look; Supports chat style and list style; Font changeable SMS backup and restore by all or by conversations, supports XML format, send backup file by email Support schedule SMS; Group texting Settings backup and restore Notification with privacy mode and reminder notification Security lock, support lock by thread; Blacklist Link: This application is available for download at: https://market.android.com/details?id=com.jb.gosms 3. Dolphin HD browser Description: Dolphin Browser HD is a professional mobile browser presented by Mobotap Inc. Dolphin Browser HD is the most advanced and customizable web browser. You can browse the Web with the greatest speed and efficiency by using Dolphin browser HD. The main browsing screen is clean and uncluttered. Other than the Home and Refresh buttons that flank the address bar, Dolphin HD doesn't clutter the main interface with other quick-access buttons. In addition to tabbed browsing, bookmarking (that syncs to Google bookmarks), and multitouch zooming, it can also flag sites to read later as well as a tie-in to Delicious. 
You can search content within a page, subscribe to RSS feeds through Google Reader, and share links with social networks. Another great feature of this app is the capability to download YouTube videos. Screenshots: Features: Manage Bookmarks Multi Touch pinch zoom Unlimited Tabs Colorful theme pack Gestures as shortcuts for common commands You can save web pages to read them offline with all images preserved Link: This application is available for download at: https://market.android.com/details?id=mobi.mgeek.TunnyBrowser 4. Winamp Description: The one big advantage that Winamp has over other playback apps is that it can sync tracks wirelessly to the device over your home network, so you don’t have to fuss with a USB cable, making it easier to manage your music. You can set Winamp to sync automatically, every time you connect your Android phone to Winamp, which makes it incredibly easy to send new playlists, purchases and downloads to your portable player, sans USB. The interface is probably the most notable upgrade over the stock player. The playback controls remain on-screen pretty much wherever you are in the app – a small touch, but one that vastly improves the functionality of Winamp. Being able to control playback from any point is more useful than you might expect. Screenshots: Features: iTunes library & playlist import Wireless & wired sync with the desktop Winamp Media Player Over 45k+ SHOUTcast Internet radio stations Integrated Android Search & “Listen to” voice actions Play queue management Playlists and playlist shortcuts Extras Menu – Now Playing data interacts with other installed apps Link: This application is available for download at: https://market.android.com/details?id=com.nullsoft.winamp 5. Advanced Task Killer Description: One click to kill applications running in the background. Advanced Task Killer is pretty simple and easy to use. It allows you to see what applications are currently running and offers the ability to terminate them quickly and easily, thus freeing up valuable memory for other processes. It also remembers your selections, so the next time you launch it, the previously spared apps remain unchecked and the previously selected ones are checked and are ready to be shut down. You can choose to have Advanced Task Killer start at launch and there’s even the option to have it appear in your notifications bar for swift access. Screenshots: Features: Kill multiple apps with one tap Adjust the security levels It comes with a notification bar icon You can kill apps automatically by selecting one of auto-kill level: Safe, Aggressive or Crazy Link: This application is available for download at: https://market.android.com/details?id=com.rechild.advancedtaskkiller Summary In this article we discussed the Top 5 must-have applications for your android phone. Further resources on this subject: Android Application Testing: Getting Started [Article] Flash Development for Android: Audio Input via Microphone [Article] Android Application Testing: TDD and the Temperature Converter [Article] Android User Interface Development: Animating Widgets and Layouts [Article]