In the last couple of weeks (since avdmanager was released?) we've received user reports about freezing and slow emulators. The main issue is usually that UI tests fail here, while in other environments they run successfully. We've reproduced the same behavior ourselves, and it seems the newer API versions are the ones affected.
###Workarounds, tips…
If you need API level 23, 24 or 25: the google_apis ABIs seem to be broken. The emulator starts and the wait-for-android-emulator step receives the successful-boot signal, but after a couple of seconds the emulator freezes completely.
The default ABIs are reported to be working (we're still waiting for reports on this, please share!)
Creating emulators with the (currently latest) Create Android Emulator v1.1.5 step seems to cause a boot timeout with every emulator.
We're waiting for reports on this.
###We are working on this issue already
But we still need more info.
###Let’s discuss this
This topic was created to collect feedback and reports from you. If you have anything to say about Android emulators, please share it with us right away!
Currently, there is no default system image for API 25, only google_apis.
root@7871ff1d62c6:/bitrise/src# sdkmanager "system-images;android-25;default;armeabi-v7a"
Warning: File /root/.android/repositories.cfg could not be loaded.
Error: Failed to find package system-images;android-25;default;armeabi-v7a
I have a Bitrise app scheduled to build every day, always with the latest steps and a google_apis android-25 armeabi-v7a emulator. The source code does not change. Usually the build succeeds, but sometimes it fails due to emulator issues.
We also have scheduled tests and I noticed this unstable behavior as well. On some emulators a boot timeout happens; on others the step reads the boot status as booted, but after a couple of seconds the emulator freezes and becomes unresponsive. The other mystery is when the emulator boots and does not freeze, yet tests that work on a local emulator don't work on Bitrise. The log usually says timeout, or the test runs fully but for some reason the result comes back as failed.
This is a summary of testing tons of emulators.
We need to figure out a scheme that works: find Android API levels, ABIs, etc. that work more than 80% of the time.
I tried multiple variations of flags, and using HAXM as well. I think the test result should not depend on the speed of the device: if something can run on a local PC, it should run on Bitrise as well, just a lot slower.
An ANR (or something similar, like an exceeded timeout) may occur in some system components after boot if emulation is slow enough.
I'm also collecting the logs produced by the emulators' logcat in that app. I haven't analyzed them deeply, but here is one example which appears when tests fail due to an unknown API level and which is not present normally:
We’re seeing the same thing.
Testing Create Android Emulator v1.1.3, launching an API 19 emulator with Play Services, and it never boots.
The emulator starts but seems to get stuck on the boot screen.
I wasn’t able to see how you guys check for emulator boot since that’s in your adbManager (which isn’t public?)
Here is the output from the moribund emulator:
UPDATE:
Using the same step versions, start-android-emulator still times out after 120 seconds with a default emulator (API 19 with NO google_apis).
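(For context, checking boot completion is typically done by polling the emulator's system properties over adb, e.g. sys.boot_completed. The sketch below is only an assumption of that usual approach, not necessarily what the Bitrise step does; it assumes adb is on the PATH and a single emulator is attached.)
import java.io.BufferedReader;
import java.io.InputStreamReader;

public final class BootCheck {

    // Returns true once the emulator reports sys.boot_completed == "1".
    static boolean isBooted() throws Exception {
        Process p = new ProcessBuilder("adb", "shell", "getprop", "sys.boot_completed")
                .redirectErrorStream(true)
                .start();
        try (BufferedReader r = new BufferedReader(new InputStreamReader(p.getInputStream()))) {
            String line = r.readLine();
            p.waitFor();
            return line != null && line.trim().equals("1");
        }
    }

    public static void main(String[] args) throws Exception {
        // Poll for up to ~120 seconds, roughly the timeout mentioned above.
        boolean booted = false;
        for (int i = 0; i < 120 && !(booted = isBooted()); i++) {
            Thread.sleep(1000);
        }
        System.out.println(booted ? "booted" : "boot timeout");
    }
}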
Thanks @koral.
There must be some issue with the images. Querying the system properties locally, none of them return the desired value.
I'll look to see if I can change some AVD settings.
Is there a way to verify if the images being used have changed?
Tested around, and it seems android-24;armeabi-v7a;default works absolutely fine. Of course with -no-window and -gpu off.
However, Espresso UI tests are super sensitive. On emulators without HAXM or KVM, the test results are really random: for example, a click gets detected as a long click, and View events are called before the layout is actually inflated. So speed is really important for Espresso UI tests. To confirm this, I started an x86 emulator with HAXM enabled and ran multiple CPU-heavy tasks on my PC, and inside the emulator as well, to make it a bit slower. It was an easy way to confirm the theory, because the tests started to fail randomly.
Not sure if I understand correctly, but if you want to check whether a particular file has changed, you can e.g. compare its hash (obtained using sha256sum or similar) to the value calculated before performing any actions.
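(If a small program is preferable to sha256sum, here is a minimal Java sketch of the same idea; the system image path below is only an illustration.)
import java.io.InputStream;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.security.DigestInputStream;
import java.security.MessageDigest;

public final class ImageHash {

    // Streams the file through a SHA-256 digest and returns the hex string,
    // equivalent to what sha256sum prints for the same file.
    static String sha256(String path) throws Exception {
        MessageDigest md = MessageDigest.getInstance("SHA-256");
        try (InputStream in = new DigestInputStream(Files.newInputStream(Paths.get(path)), md)) {
            byte[] buf = new byte[8192];
            while (in.read(buf) != -1) {
                // reading the stream drives the digest updates
            }
        }
        StringBuilder hex = new StringBuilder();
        for (byte b : md.digest()) {
            hex.append(String.format("%02x", b));
        }
        return hex.toString();
    }

    public static void main(String[] args) throws Exception {
        // Compare this value to the hash recorded before running any steps.
        System.out.println(sha256("/path/to/system-images/android-25/google_apis/armeabi-v7a/system.img"));
    }
}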
Is there something relevant printed to the logcat or emulator kernel log?
Not really. Usually the test just fails as if the app doesn't do what it should.
Espresso tests listen for View events. Sometimes the test receives the event, but I can clearly see on the emulator that the exact view is not even inflated yet. This results in a NullPointerException when the test engine wants to read a TextView's content from the received event's view ID: since the view is not inflated, it cannot be reached yet. Not sure what's causing this.
Actually I could not find a way to "slow down" Espresso UI tests; any suggestions?
Tried this and it still times out. At this point I believe I've tried every emulator and none of them work for us.
We need a resolution! A build server isn't very useful if it's not building.
Separate the calls, and put a delay between them that is just long enough, like this:
import android.os.SystemClock;
import android.support.test.espresso.ViewInteraction;
import static android.support.test.espresso.Espresso.onView;
import static android.support.test.espresso.action.ViewActions.closeSoftKeyboard;
import static android.support.test.espresso.action.ViewActions.typeTextIntoFocusedView;
import static android.support.test.espresso.assertion.ViewAssertions.doesNotExist;
import static android.support.test.espresso.assertion.ViewAssertions.matches;
import static android.support.test.espresso.matcher.RootMatchers.withDecorView;
import static android.support.test.espresso.matcher.ViewMatchers.isDisplayed;
import static android.support.test.espresso.matcher.ViewMatchers.withId;
import static android.support.test.espresso.matcher.ViewMatchers.withText;
import static org.hamcrest.Matchers.is;
import static org.hamcrest.Matchers.not;
// Inside a test method; mActivity is the Activity under test.
ViewInteraction x = onView(withId(R.id.auto_complete_text_view));
x.perform(typeTextIntoFocusedView("South "), closeSoftKeyboard());
SystemClock.sleep(200);
// Should be displayed
x = onView(withText("South China Sea"));
SystemClock.sleep(200);
x.inRoot(withDecorView(not(is(mActivity.getWindow().getDecorView()))));
SystemClock.sleep(200);
x.check(matches(isDisplayed()));
SystemClock.sleep(200);
// Should not be displayed.
x = onView(withText("Southern Ocean"));
SystemClock.sleep(200);
x.inRoot(withDecorView(not(is(mActivity.getWindow().getDecorView()))));
SystemClock.sleep(200);
x.check(doesNotExist());
This is the only way I've managed to make all the tests pass. The above is just an example; the sleep times and the implementation can be anything. The main point is simply to bridge the delay between UI events and the actual UI view state that occurs on slower devices.
Do not put sleeps in Espresso tests. That is not a "solution." Companies investing in automation require tests to pass consistently and quickly, and neither of those goals is possible with sleeps. Espresso as a framework is designed around the concept of idle resource listeners: the app tells Espresso when to wait, without having to hardcode this into the tests. The Android emulators provided by Google are not suitable for Espresso testing at scale; they are simply too flaky. Google's Firebase Test Lab is the best option on the market today for virtual device testing on Android.
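(For reference, a minimal sketch of the idling-resource approach described above, assuming the support-library era espresso-idling-resource artifact; AppIdling and loadData are illustrative names, not anyone's actual code.)
import android.support.test.espresso.idling.CountingIdlingResource;

// Illustrative holder visible to both the app and the tests.
public final class AppIdling {
    // The app increments this before starting async work and decrements it when
    // the work finishes, so Espresso waits exactly as long as needed, no sleeps.
    public static final CountingIdlingResource BUSY = new CountingIdlingResource("app-busy");
}

// In production code, around the async call:
//     AppIdling.BUSY.increment();
//     loadData(result -> { show(result); AppIdling.BUSY.decrement(); });
//
// In the test class, register it so onView(...) blocks until the app is idle:
//     @Before public void setUp()    { Espresso.registerIdlingResources(AppIdling.BUSY); }
//     @After  public void tearDown() { Espresso.unregisterIdlingResources(AppIdling.BUSY); }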
Tried both, without success. Sometimes 1 test out of 4 passed, but it is still not trustworthy.
Yep, there is the issue: these View events are fired before the view is inflated enough to be retrieved, which means continuous NullPointerExceptions. This is what I wanted to fix. Looking at apps nowadays, developers don't have the goal that a user should be able to navigate through 6 screens, 20 UI elements and a couple of other features within 0.5 seconds. Rather, they want to know whether the components work, don't cause crashes, and show what they should.
I understand that this cannot be a solution for everyone. For the tests where I used it, it doesn't matter whether the list is filled within 50 ms or 200 ms.