Wednesday, 27 March 2013

Why Mobile App Testing is So Hard


Not that you need reminding, but mobile app testing is difficult. Beyond the fact that it’s still a fairly new medium and testers are still figuring out the nuances of mobile app testing, the single biggest challenge is the sheer size of the mobile ecosystem.
Even when focusing on a single operating system, the number of different devices, screen sizes and OS versions – not to mention carriers and locations – makes testing a single app quite complicated. To illustrate this, let's take a look at a recent Engadget article discussing the launch of the BBC Sport app for Android. According to the article, the BBC's newest app "is compatible with Android devices running version 2.2 or above and while the Beeb has worked to ensure it works on the recent wave of 7-inch tablets, it doesn't currently support larger sizes."
Encompassing smartphones and tablets means the app has to be tested on more screen sizes than you can count. In fact, here's an image Engadget published of just some of the devices the BBC used to test the Android app.

I count 13 devices in that image. Beyond the devices themselves, developers and testers also have to contend with OS versions. According to Android Developers' latest count, version 2.2 and up account for nearly 98% of the Android user base. That's pretty good coverage, so it's easy to see why the BBC chose to support versions 2.2 and newer. How many versions is that, you ask? Five, plus all their accompanying sub-versions. Android 2.2 is nicknamed Froyo and, sticking with the alphabetical desserts theme, Android is now on Jelly Bean. That means this app works on (and was presumably tested on) Froyo, Gingerbread, Honeycomb, Ice Cream Sandwich and Jelly Bean. If you're being thorough, each version needs to be tested on multiple devices to see how each combination reacts.
So for this one app the BBC had to test on a slew of different devices, across the major carriers in the UK and on five different OS versions. That's what makes mobile application testing so complicated. (And why in-the-wild testing makes it so much easier.)
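To make the combinatorial explosion concrete, here is a minimal sketch in Python that enumerates the device/OS pairs a thorough test pass would have to cover. The device and version lists are invented stand-ins, not the BBC's actual matrix, and carriers and locations would multiply the combinations further.

```python
from itertools import product

# Hypothetical subset of target devices and Android versions; the real matrix
# is larger and would also be crossed with carriers and locations.
devices = ["Galaxy S III", "HTC One X", "Nexus 7", "Galaxy Tab 2 7.0"]
os_versions = ["2.2 Froyo", "2.3 Gingerbread", "3.2 Honeycomb",
               "4.0 Ice Cream Sandwich", "4.1 Jelly Bean"]

# Every device/OS combination that a thorough pass would need to visit.
test_matrix = list(product(devices, os_versions))

print(f"{len(test_matrix)} device/OS combinations to cover")
for device, version in test_matrix:
    print(f"  {device} on Android {version}")
```

Even this toy list already yields 20 combinations, which is why lead devices and in-the-wild testers matter so much.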

How to Track the App Economy’s Daily Movements


Part of having a successful app is understanding the app economy as a whole – or, at the very least, the state of the category you’re about to enter. Number of downloads is one metric. Longevity and number of versions/updates are a few more. 



Star ratings and reviews offer some insight. But how can you judge the overall health of the app market? That’s the question the Applause Index set out to answer. The brand new index is “the first-ever way to measure and track the state of user satisfaction in the mobile apps economy.” Here’s how it works, from the Applause App Analytics Blog:
Similar to the Dow Jones Industrial Average, the Applause Index provides a daily look at how users feel about the bellwether iOS and Android apps and assigns a weighted, cumulative score (again, similar to those used to measure the overall stock market). This enables journalists, analysts and mobile decision makers to understand trends in the macro apps economy and make more informed decisions.
The Applause Index also includes a full complement of sub-indices that measure specific categories of market-representative apps across ten key categories, including Gaming, Lifestyle, Entertainment, Travel and Content.
The Applause Index is calculated based upon the Applause Score of 60 iOS and 60 Android apps that meet rigorous selection criteria. It then applies a weighted formula to produce a daily cumulative score across these bellwethers. The result is a top-down look at the overall sentiment of users about the apps economy; across categories; across app stores; and over time.
Those bellwether apps include Angry Birds, Evernote, Yelp, Netflix, PayPal, Quora, NFL Mobile and Skype, just to name a few. Here are a few examples of the criteria used to select the indexed apps:
  • Number of app versions and app reviews
  • Length of time an app has been available
  • Performance data across Applause Attributes
  • Representative allocation across app store categories
  • 100% overlap between the list of 60 iOS and 60 Android apps
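The exact Applause formula isn't public, but as a rough illustration, here is a minimal sketch of how a weighted, cumulative score over a basket of bellwether apps might be computed. The per-app scores and weights below are invented for the example; only the app names come from the post.

```python
# Hypothetical per-app quality scores (0-100) and index weights for a tiny basket;
# the real Applause Index tracks 60 iOS and 60 Android bellwether apps.
basket = {
    "Angry Birds": (92, 1.5),
    "Evernote":    (88, 1.0),
    "Yelp":        (85, 1.0),
    "Netflix":     (90, 1.2),
    "Skype":       (83, 1.3),
}

def weighted_index(apps):
    """Weighted average of app scores, loosely analogous to a weighted market index."""
    total_weight = sum(weight for _, weight in apps.values())
    return sum(score * weight for score, weight in apps.values()) / total_weight

print(f"Index value for the day: {weighted_index(basket):.1f}")
```

Recomputing such a score daily, as the scores of the underlying apps move, is what produces the Dow-Jones-style trend line described above.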
To learn more about the Applause Index, read the full post on the App Analytics Blog, or check the index out for yourself! To see how your apps, your competition's apps or your favorite apps are faring in the hands of users, look them up on Applause.

Tuesday, 26 March 2013

Picking the right handsets for your project


We all know that the mobile world is dynamic: plenty of new handsets ship while we are developing our product and testing on what we believe are the "hottest" handsets in the market.
It is clear that being agile and fast in the way we develop, test and deploy our mobile products is key to staying attractive in the market; however, it is impossible to support every handset and still stay ahead of the market.
So, staying up to date in what you support is not simple, but it is possible.
When you start developing your product, keep in mind that by picking the "right" 10 handsets that are "hot" in the market you can reach roughly 50% market coverage (note that there are lead devices which represent a whole family of handsets, so testing on them gives you a lot of value for the effort); going up to around 30 devices may get you to roughly 80% coverage of the market.
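As a back-of-the-envelope illustration of that coverage math, here is a minimal sketch that, given per-device market-share figures, sorts devices by popularity and reports how many are needed to hit a coverage target. The share numbers are invented and truncated; in practice they would come from carrier or analytics data.

```python
# Hypothetical market shares (fractions of the installed base); the long tail
# of smaller devices is omitted here.
market_share = {
    "Galaxy S III": 0.12, "Galaxy S II": 0.09, "iPhone 4S": 0.08,
    "HTC One X": 0.05, "Xperia Arc S": 0.04, "Galaxy Ace": 0.04,
}

def devices_for_coverage(shares, target):
    """Pick the most popular devices until their cumulative share reaches the target."""
    chosen, covered = [], 0.0
    for device, share in sorted(shares.items(), key=lambda kv: kv[1], reverse=True):
        chosen.append(device)
        covered += share
        if covered >= target:
            break
    return chosen, covered

picked, covered = devices_for_coverage(market_share, target=0.30)
print(f"{len(picked)} devices cover {covered:.0%} of the market: {picked}")
```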
How should you decide, then?
Picking the handsets should combine the following two aspects:
- Market research
- Right family identification/lead devices
Market Research: The way to determine what is relevant in the market is to do some research and analysis, either through leading mobile blogs or, even simpler, by going through the world's leading mobile operators and seeing what they are currently selling (e.g. Vodafone Germany today lists among its top devices the Samsung Galaxy SIII, Samsung Galaxy SII, SEMC Xperia Arc S etc. – http://shop.vodafone.de/Shop/smartphones/; if you go to Vodafone UK you will see mostly the same ones, as well as the HTC One X and others – http://www.vodafone.co.uk/brands/android/index.htm).
Building a matrix and unifying the handset lists of the leading carriers in Europe, the U.S. and Asia should give you the lead handsets you will want to support and test over the 3-6 months ahead (per OS: Android, iOS, Windows Phone and BlackBerry).
Families: The family aspect should then be applied to the above list. If, for example, you reached a common list of, say, 50 handsets, I am fairly certain the list can be cut in half by properly comparing the handsets by OS, screen resolution and OEM (this can be done through sites like GSM Arena – http://www.gsmarena.com) and reducing the list to leads, members and families.
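A minimal sketch of that family reduction, assuming you already have each handset's OEM, OS and resolution (e.g. copied from GSM Arena). The grouping key and sample data are illustrative only; you may want a looser or stricter definition of "family".

```python
from collections import defaultdict

# Illustrative handset data: (name, OEM, OS family, screen resolution).
handsets = [
    ("Galaxy S III", "Samsung", "Android 4.x", "720x1280"),
    ("Galaxy Nexus", "Samsung", "Android 4.x", "720x1280"),
    ("Galaxy S II",  "Samsung", "Android 2.3", "480x800"),
    ("Xperia Arc S", "Sony",    "Android 2.3", "480x854"),
    ("HTC One X",    "HTC",     "Android 4.x", "720x1280"),
]

# Group handsets into families sharing OEM, OS and resolution,
# then keep one lead device per family for the test plan.
families = defaultdict(list)
for name, oem, os_family, resolution in handsets:
    families[(oem, os_family, resolution)].append(name)

leads = [members[0] for members in families.values()]
print(f"Reduced {len(handsets)} handsets to {len(leads)} leads: {leads}")
```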
To ease your pain, please find attached to this post an up-to-date list of common handsets, by OEM, being sold worldwide these days :) As you will see, there are a lot of similar handsets across all the large operators, which highlights the main devices to focus on.
P.S: With regards to the leading Android/iOS tablets these days:
iOS – iPad 2 and iPad 3
Android – Samsung Galaxy Tab 10.1, Motorola Xoom, Asus Nexus 7, Dell Streak 7, Samsung Galaxy Tab 7, Sony Tablet S, Asus Transformer TF300, Asus Transformer TF700

Cross browser comparison (Focus on iOS)


It is a fact that more and more hybrid/web applications are being developed lately, HTML5 applications and more.
The common assumption among application developers is that since it is a web application it will run cross-platform without much effort in QA and UI work.
That might be true in some (simple) cases; however, on top of the complications of testing web applications, we must not forget that each app needs to comply with the iOS and Android UI guidelines (icons, fonts etc.), and another important point to keep in mind is cross-browser compatibility.
Each user (on Android or iOS, and soon Windows Phone) may choose the browser he prefers to surf and work with. That user will not change his browser, and will expect "your" application to run top-notch on it.
In this post I will not cover all existing iOS browsers (nor Android ones), only the leading ones. My recommendation is to prepare a similar testing matrix for the existing mobile web browsers and perform at least some level of sanity testing on each, to make sure your application works properly and also meets the desired guidelines (see the sketch at the end of this post).
For iOS we are familiar with the following browsers:
- Safari (the most common and default browser)
- Google Chrome
- Mercury Browser
- Dolphin Browser
- Opera Mini
Please see below some screenshots of the exact same web page (BBC News, a high-quality web site) being run on the above browsers. (There is not too much difference, which is good news :) although there are some look-and-feel differences; for other apps I am sure the situation will be different.)
[Screenshots: the BBC News page rendered in the Safari, Google Chrome, Mercury, Dolphin and Opera Mini browsers]
Do not forget that if the above looks quite different from one browser to another, it will introduce additional challenges for the automation team.
To sum up, the well-known mobile matrix of devices and OS versions also extends to browsers per platform: a vital thing to cover, and one that further complicates automation.
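Here is the minimal sketch of a browser sanity matrix promised above. The browsers are the ones covered in this post; the checks are illustrative placeholders you would replace with your own manual steps or automated tests.

```python
# Browsers covered in this post; an Android column could be filled in the same way.
BROWSERS = {
    "iOS": ["Safari", "Google Chrome", "Mercury Browser", "Dolphin Browser", "Opera Mini"],
}

# Placeholder sanity checks for each browser.
SANITY_CHECKS = [
    "page loads",
    "layout matches the UI guidelines",
    "fonts and icons render correctly",
    "key links and buttons work",
]

def build_matrix(browsers, checks):
    """Enumerate every (platform, browser, check) combination to track or automate."""
    return [(platform, browser, check)
            for platform, names in browsers.items()
            for browser in names
            for check in checks]

for platform, browser, check in build_matrix(BROWSERS, SANITY_CHECKS):
    print(f"[{platform}] {browser}: {check} -> TODO")
```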

Windows Phone 8 handsets are starting to pop out


As many anticipated, we are starting to see more and more investment in the new Windows Phone platform by many OEMs, not only Nokia, which has been collaborating with Microsoft for a while.
In this short post I will list the upcoming Windows Phone 8 phones which you will soon start to see.
HTC:
HTC is announcing the launch of its new Windows Phone 8 phone called HTC Accord.
The phone comes with a 1.5 GHz dual-core Snapdragon processor, a 4.3" screen, an 8 MP camera, support for an external microSD card and NFC; rumors also say the phone will support LTE.
Samsung:
Samsung announced its new ATIV S Windows Phone 8 phone with the following characteristics: a 4.8" Super AMOLED screen, a 1.5 GHz dual-core processor, an 8 MP Full HD rear camera with a 1.9 MP front camera, support for an external microSD card (which is new on the WP platform) and NFC support. The phone, running WP8, will ship with the Internet Explorer 10 browser, the Mobile Office suite and the new SkyDrive cloud storage service.
The ATIV brand actually starts a new line of Samsung products for WP8 (ATIV Tab 10.1", ATIV Smart PC and more).
Nokia:
Nokia, which is of course the WP platform pioneer, is also announcing two new phones running WP8, the Nokia Lumia 920 and the Nokia Lumia 820.
The Nokia Lumia 920 will come with a 4.5" screen, and the Nokia Lumia 820 with a smaller 4.3" screen.
The news around these two phones is their support for the new PureView camera technology, quoted for these two phones at 21 megapixels.
Summary:
As I always state, the mobile world is dynamic and constantly changing, and we are already seeing that even for the new Windows Phone platform the biggest OEMs are starting to dive in, so it will be interesting to see how this change impacts the mobile market and the existing iOS, Android and RIM platforms.
From a testing perspective we also see a variety of new screen sizes, which has always been and will remain a challenge for testers and test automation (above we already mentioned 4.3", 4.5" and 4.8"). WP8 tablets, as you saw above, are also starting to be deployed, extending this platform's market.

Best practices for iOS mobile application testing


iOS changed the mobility game, no doubt about it. It paved the way for the 'mobile era' by offering amazing functionality with a simple user experience. However, when it comes to testing and monitoring, working with iPhone/iPad mobile applications can be anything but simple…
As the iOS app market continues to produce record growth, challenges and complexities surrounding iOS application testing also continue to interfere with development. A key challenge of iOS testing is that, unlike the open-source Android OS, Apple iOS is a closed operating system. Added complexity during the development and testing stages arises with a closed system, since users can’t extract necessary data from low level objects, which are essential for test automation. So, what’s the best approach for getting the necessary level of access to the iOS device – rooting (jailbreaking) or compile-time source instrumentation? Should you base your testing on native objects or OCR-based screen analysis?
Let’s take a deeper look into some of these challenges and why a cloud-based hybrid approach is important to offer developers and testers the necessary coverage, capabilities and flexibility to deliver better iOS apps and deploy them with confidence.

Rooting (jailbreaking) vs. Source Instrumentation (compile-time)

There are two common methods used today in the mobile testing industry to address this challenge (i.e. access to the low level objects): rooting (jailbreaking) and source instrumentation (i.e. compile-time solution).
Jailbreaking refers to the process of removing the limitations placed by Apple on the iOS device in order to get low-level (root) access to the operating system. This allows the tester to recognize the objects within the application under test.
Source instrumentation is performed by compiling the application under test with an additional piece of code that provides access (a "back door") to the low-level OS for object recognition. This code enables the tester to execute low-level calls and get object IDs from the operating system (without the need to root/jailbreak the device).
The decision as to which approach to adopt depends strongly on several considerations (below are just a few):
1)    The used SDLC process
2)    Corporate policies
3)    Application under test
4)    Frequency of testing
Perfecto Mobile provides its end users with the freedom to choose what fits them best, while taking into consideration the advantages and disadvantages of each approach. When customers need to quickly test either a new iOS version or a new iOS device, the jailbreaking approach is less suitable. In such a case, the compile-time method is preferable – even though it complicates the SDLC by introducing additional code to the application being tested.
On the other hand, using a jailbroken device lets you test the application with the exact code with which it will be released (the compile-time approach mandates that before store submission you remove the "back door", or be exposed to serious security issues). This eliminates the need for recompilation and intrusive operations that could potentially pose a risk to quality. Companies using a compile-time approach should also consider regulations (such as HIPAA) that enforce testing on the final binary (and not on a debug or test-friendly version, etc.).
The combined (hybrid) approach lets you choose which type of tests to implement on which iOS device according to the nature of your application, project needs, and policy. When the test devices are deployed and securely managed in a "private cloud" (such as that offered by Perfecto Mobile), such a configuration guarantees that the jailbreak method does not introduce any risks or abuse of the platform for non-testing purposes: the jailbroken device is used only for testing, in a closed and secure environment. This is analogous to the way iOS devices used for development carry a "developer signature," and the way Android devices used for development have more levels of access than are required during the normal ALM cycle.

The Need for a Hybrid Approach to Object Recognition

Testing a mobile application requires strong object recognition capabilities. Visual analysis alone might not be sufficient: OCR technology, for example, can detect UI issues and glitches on the test devices, but cannot ensure 100% accuracy due to its heuristic nature. On the other hand, low-level objects might "miss" obvious problems that a visual analysis can easily detect. That's why a hybrid approach incorporating both visual and native object analysis is imperative for covering all mobile business cases. Such an approach is supported by Perfecto Mobile.
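As a rough illustration of what "hybrid" means here, the sketch below combines a native object lookup with an OCR-based visual check. find_native_element and ocr_find_text are hypothetical stand-ins (simulated here) for whatever object-level and visual APIs your tool exposes; this is not Perfecto Mobile's actual API.

```python
def find_native_element(object_id):
    """Hypothetical object-level lookup; simulated here, in practice backed by the tool's object API."""
    return {"id": object_id, "bounds": (10, 10, 110, 50)}

def ocr_find_text(expected_text):
    """Hypothetical OCR lookup; simulated here, in practice backed by the tool's visual analysis."""
    return {"text": expected_text, "bounds": (12, 14, 108, 46)}

def verify_button(object_id, expected_label):
    """Hybrid check: the control must exist as a native object AND be visibly rendered."""
    native = find_native_element(object_id)   # catches missing or broken objects
    visual = ocr_find_text(expected_label)    # catches rendering glitches and overlaps
    if native is None:
        return False, "native object not found"
    if visual is None:
        return False, "label not visible on screen (possible UI glitch)"
    return True, "control present and correctly rendered"

print(verify_button("btn_login", "Log in"))
```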

Object level analysis vs. Visual analysis

The screenshot above shows the difference between object-level analysis and visual analysis (object-level analysis alone would not have detected the button overlapping the text).

The Perfecto Mobile Approach: Go Cloud, Go Hybrid

Perfecto Mobile's experience as a market leader has taught us that the best approach is to present each customer with all possible alternatives, making them available inside the cloud:
- Real devices + emulators (in the cloud)
- OCR screen analysis + OS-level native objects (in the cloud)
- Rooted/jailbroken devices + non-rooted/non-jailbroken devices (in the cloud)
With hundreds of thousands of automation hours running every month on our platform, we are well-positioned to suggest and guide, but not to “judge” what’s best for everyone…
Perfecto Mobile hybrid object support on a rooted Android device and a non-jailbroken iPhone