****** Supporting Different Languages and Cultures ******

Create_Locale_Directories_and_Resource_Files Use_the_Resources_in_your_App

Apps include resources that can be specific to a particular culture. For example, an app can include culture-specific strings that are translated to the language of the current locale. It's a good practice to keep culture-specific resources separated from the rest of your app. Android resolves language- and culture-specific resources based on the system locale setting. You can provide support for different locales by using the resources directory in your Android project.

You can specify resources tailored to the culture of the people who use your app. You can provide any resource type that is appropriate for the language and culture of your users. For example, the following screenshot shows an app displaying string and drawable resources in the device's default (en_US) locale and the Spanish (es_ES) locale.

[https://developer.android.com/images/training/languages_01.png]
Figure 1. App using different resources depending on the current locale

***** Create Locale Directories and Resource Files *****

To add support for more locales, create additional directories inside res/. Each directory's name should adhere to the following format:

<resource type>-b+<language code>[+<country code>]

For example, values-b+es/ contains string resources for locales with the language code es. Similarly, mipmap-b+es+ES/ contains icons for locales with the es language code and the ES country code. Android loads the appropriate resources according to the locale settings of the device at runtime. For more information, see Providing_Alternative_Resources.

After you've decided on the locales to support, create the resource subdirectories and files. 
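The -b+ qualifier format above embeds a BCP-47 language tag, with "+" standing in for the tag's usual "-" separator. A small plain-Java sketch of that correspondence (QualifierDemo and its helper are hypothetical, for illustration only):

```java
import java.util.Locale;

public class QualifierDemo {
    // Convert an Android resource qualifier such as "b+es+ES"
    // into the BCP-47 tag it represents ("es-ES").
    static String qualifierToTag(String qualifier) {
        return qualifier.substring(2).replace('+', '-');
    }

    public static void main(String[] args) {
        Locale locale = Locale.forLanguageTag(qualifierToTag("b+es+ES"));
        // The language and country parts match the directory qualifier.
        System.out.println(locale.getLanguage() + " / " + locale.getCountry());
    }
}
```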
For example:

values-b+es/
mipmap/
    country_flag.png
mipmap-b+es+ES/
    country_flag.png

For example, the following are some different resource files for different languages:

English strings (default locale), /values/strings.xml:

Spanish strings (es locale), /values-es/strings.xml:

¡Hola Mundo!

United States' flag icon (default locale), /mipmap/country_flag.png:
[https://developer.android.com/images/training/languages_us_flag.png]
Figure 2. Icon used for the default (en_US) locale

Spain's flag icon (es_ES locale), /mipmap-b+es+ES/country_flag.png:
[https://developer.android.com/images/training/languages_es_flag.png]
Figure 3. Icon used for the es_ES locale

Note: You can use the locale qualifier (or any configuration qualifier) on any resource type.

***** Use the Resources in your App *****

You can reference the resources in your source code and other XML files using each resource's name attribute. In your source code, you can refer to a resource using the syntax R.<resource type>.<resource name>. There are a variety of methods that accept a resource this way. In other XML files, you can refer to a resource with the syntax @<resource type>/<resource name> whenever the XML attribute accepts a compatible value.

<ImageView android:src="@mipmap/country_flag" />
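The string file contents above were flattened during extraction; only "¡Hola Mundo!" survived. A minimal pair of files might look like the following (the resource name hello_world is an assumption, borrowed from the TextView example used elsewhere in this guide):

```xml
<?xml version="1.0" encoding="utf-8"?>
<!-- /values/strings.xml : default (English) strings.
     "hello_world" is an assumed resource name for illustration. -->
<resources>
    <string name="hello_world">Hello World!</string>
</resources>
```

```xml
<?xml version="1.0" encoding="utf-8"?>
<!-- /values-es/strings.xml : Spanish strings -->
<resources>
    <string name="hello_world">¡Hola Mundo!</string>
</resources>
```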
****** Supporting Different Languages ******

Create_Locale_Directories_and_String_Files Use_the_String_Resources

It's always a good practice to extract UI strings from your app code and keep them in an external file. Android makes this easy with a resources directory in each Android project.

***** Create Locale Directories and String Files *****

To add support for more languages, create additional values directories inside res/ that include a hyphen and the ISO language code at the end of the directory name. For example, values-es/ is the directory containing simple resources for the locales with the language code "es". Android loads the appropriate resources according to the locale settings of the device at run time. For more information, see Providing_Alternative_Resources.

Once you've decided on the languages you will support, create the resource subdirectories and string resource files. For example:

values/
    strings.xml
values-es/
    strings.xml
values-fr/
    strings.xml

Add the string values for each locale into the appropriate file. At runtime, the Android system uses the appropriate set of string resources based on the locale currently set for the user's device. For example, the following are some different string resource files for different languages.

English (default locale), /values/strings.xml:

My Application

Spanish, /values-es/strings.xml:

Mi Aplicación
Hola Mundo!

French, /values-fr/strings.xml:

Mon Application
Bonjour le monde !

Note: You can use the locale qualifier (or any configuration qualifier) on any resource type.

***** Use the String Resources *****

You can reference your string resources in your source code and other XML files using the resource name defined by the element's name attribute. In your source code, you can refer to a string resource with the syntax R.string.<string_name>. There are a variety of methods that accept a string resource this way. In other XML files, you can refer to a string resource with the syntax @string/<string_name> whenever the XML attribute accepts a string value.
<TextView android:text="@string/hello_world" />
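In source code, both reference styles look roughly like the following sketch (an Android fragment, not standalone-runnable; it assumes a string resource named hello_world and a hypothetical view ID my_text):

```java
// Inside an Activity: fetch the string for the current locale.
// Android selects /values-es/strings.xml automatically on an "es" device.
String hello = getResources().getString(R.string.hello_world);

// Many framework methods also accept the resource ID directly:
TextView textView = (TextView) findViewById(R.id.my_text);
textView.setText(R.string.hello_world);
```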
Volley is an HTTP library that makes networking for Android apps easier and, most importantly, faster. Volley is available on GitHub. Volley may be able to help you streamline and improve the performance of your app's network operations.
Volley is an HTTP library that makes networking for Android apps easier and, most importantly, faster. Volley is available on GitHub. The core Volley library is developed on GitHub and contains the main request dispatch pipeline as well as a set of commonly applicable utilities, available in the Volley "toolbox." The easiest way to add Volley to your project is to add a dependency on the library to your app's build.gradle file. You can also obtain the full source by cloning the repository:

git clone https://github.com/google/volley
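The Gradle dependency line itself did not survive extraction. At the time this guide was written it was typically declared as follows (the version number is an assumption; check the Volley releases for the current one):

```groovy
dependencies {
    compile 'com.android.volley:volley:1.0.0'
}
```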
For interactive video training, use the following video course about Android Wear development: Start_the_video_course
To get started using the TIF Companion Library, add the following to your app's build.gradle file:

compile 'com.google.android.libraries.tv:companionlibrary:0.1'

The TIF Companion Library is distributed as a Gradle dependency through JCenter, and not with the Android SDK.
To get started using the TIF Companion Library, download the TV_Input_Service sample_app, and copy the library directory into your project's root directory. Then, add the following to your project's settings.gradle file:

include ':library'

In your app's build.gradle file, add the following to your dependencies:

compile project(':library')

The TIF Companion Library is distributed as part of the TV_Input_Service_sample_app, and not with the Android SDK.
**** Providing custom actions ****

Android Auto displays playback controls on the user interface based on the PlaybackState. Apps should set the supported playback actions, such as play, pause, or skip track. Based on the supported actions, Android Auto displays the appropriate buttons on the screen. Android Auto apps must also support ACTION_PLAY_FROM_MEDIA_ID and ACTION_PLAY_FROM_SEARCH.

Developers can use the PlaybackState.Builder to add additional playback custom actions. The order in which the custom actions are added determines the order in which they appear to the user. Each custom action requires an icon resource.

Since apps that work with Auto are designed to run in cars with different screen sizes and densities, it is important that you provide your app's custom icons for different screen densities. This helps avoid blurring and other scaling artifacts. Here are some tips that you might find useful as you develop custom icons for your application.

*** Use vector format where possible ***

Use the vector format for custom icons whenever possible. A vector drawable allows you to scale assets without losing detail. A vector drawable also makes it easy to align edges and corners to pixel boundaries at smaller resolutions.

*** Provide drawables in multiple densities ***

If you must provide icons as bitmap drawables (.png, .jpg, and .gif files) or Nine-Patch drawables (.9.png files), then as a minimum, supply a version of each icon that's optimized for the following common car screen densities:

* mdpi (medium) ~160dpi
* hdpi (high) ~240dpi
* xhdpi (extra-high) ~320dpi

It is preferable to provide your custom icons in the following densities as well:

* xxhdpi (extra-extra-high) ~480dpi
* xxxhdpi (extra-extra-extra-high) ~640dpi (optional)

For more information about designing for different screens, see the Supporting Multiple_Screens developer guide. 
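The density buckets above scale linearly from the mdpi baseline: px = dp × (dpi / 160). A quick plain-Java sketch of that arithmetic, using a hypothetical 48 dp icon (DensityScale is an illustrative helper, not an Android API):

```java
public class DensityScale {
    // Convert a dp dimension to physical pixels at a given screen density.
    static int dpToPx(int dp, int dpi) {
        return Math.round(dp * dpi / 160f);
    }

    public static void main(String[] args) {
        // mdpi, hdpi, xhdpi, xxhdpi, xxxhdpi
        int[] dpis = {160, 240, 320, 480, 640};
        for (int dpi : dpis) {
            System.out.println(dpi + "dpi -> " + dpToPx(48, dpi) + "px");
        }
    }
}
```

So a 48 dp asset needs a 72 px bitmap at hdpi and a 96 px bitmap at xhdpi, which is why one bitmap cannot serve every bucket without scaling artifacts.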
*** Provide off icon style for disabled actions *** For cases when a custom action is unavailable for the current context, swap the custom action icon with a corresponding off icon style resource. [https://developer.android.com/images/training/icon_off.png] Sample off style custom action icons.
****** Testing Your App's Accessibility ******

***** In this document *****

Manual_testing TalkBack Switch_Access Other_services Analysis_tools Accessibility_Scanner Node_tree_debugging UI_Automator_Viewer Lint Automated_testing Espresso Robolectric User_testing Making_Apps_More_Accessible Android_Design:_Accessibility

Testing for accessibility lets you experience your app from the perspective of your users and find usability issues that you might otherwise miss. Accessibility testing can reveal opportunities to make your app more powerful and versatile for all your users, including those with disabilities. For the best results, use all of the approaches described in this doc:

* Manual testing: Interact with your app using Android accessibility services.
* Testing with analysis tools: Use tools to discover opportunities to improve your app's accessibility.
* Automated testing: Turn on accessibility testing in Espresso and Robolectric.
* User testing: Get feedback from real people who interact with your app.

***** Manual testing *****

Manual testing puts you in the shoes of your user. Android AccessibilityService objects change the way your app's content is presented to the user and how the user interacts with the content. By interacting with your app using accessibility services, you can experience your app as your users would.

**** TalkBack ****

TalkBack is Android's built-in screen reader. When TalkBack is on, users can interact with their Android device without seeing the screen. Users with visual impairments may rely on TalkBack to use your app.

*** Turn on TalkBack ***

Open your device's Settings app. Navigate to Accessibility and select TalkBack. At the top of the TalkBack screen, press On/Off to turn on TalkBack. In the confirmation dialog, select OK to confirm permissions.

Note: The first time you enable TalkBack, a tutorial launches. To open the tutorial again in the future, navigate to Settings > Accessibility > TalkBack > Settings > Launch TalkBack tutorial. 
*** Explore your app with TalkBack ***

After TalkBack is on, there are two common ways to navigate:

* Linear navigation: Quickly swipe right or left to navigate through screen elements in sequence. Double-tap anywhere to select.
* Explore by touch: Drag your finger over the screen to hear what's under your finger. Double-tap anywhere to select.

To explore your app with TalkBack, complete these steps: Open your app. Swipe through each element in sequence. As you navigate, look for the following issues:

* Does the spoken feedback for each element convey its content or purpose appropriately? (Learn how to write_meaningful_labels.)
* Are announcements succinct, or are they needlessly verbose?
* Are you able to complete the main workflows easily?
* Are you able to reach every element by swiping?
* If alerts or other temporary messages appear, are they read aloud?

For more information and tips, refer to TalkBack_user_documentation.

*** Optional: TalkBack developer settings ***

TalkBack developer settings can make it easier for you to test your app with TalkBack. To view or change developer settings, complete these steps: Open your device's Settings app. Navigate to Accessibility and select TalkBack. Select Settings > Developer settings:

Log output level: Select VERBOSE.
Display speech output: Turn on this setting to view TalkBack speech output on the screen.
Enable node tree debugging: Turn on this setting to expose the contents of your screen in your device's logs. Learn more about node_tree_debugging.

Optional: If you turned on Display speech output, you can rely only on the text output by turning off spoken feedback: In TalkBack Settings, select Speech volume > Match media volume. In your device Settings, select Sound, then set the media volume to zero.

**** Switch Access ****

Switch Access lets users interact with Android devices using a switch instead of the touch screen. 
There are several kinds of switches: assistive technology devices such as those sold by AbleNet, Enabling Devices, RJ Cooper, or Tecla*; external keyboard keys; or buttons. This service can be helpful for users with motor impairments. *Google does not endorse these companies or their products. *** Turn on Switch Access *** A simple way to configure Switch Access is with two switches. One switch is designated as the "Next" switch and moves focus around the screen, and a second "Select" switch selects the focused element. To use this two-switch method, you can use any pair of hardware keys. Note:Your experience with Switch Access may vary, depending on the tools and software that you're using: * If you use an external switch, such as a keyboard, there are additional setup steps. For example, you need to re-enable the soft keyboard. For more information, refer to the Switch_Access_user_documentation. * If you're using TalkBack 5.1 or later, a setup wizard is available to configure Switch Access. To use this wizard instead of the steps below, go to Settings > Accessibility > Switch Access > Settings > Open Switch Access setup. To set up Switch Access using the volume down key as the "Next" switch and the volume up key as the "Select" switch, complete the following steps: Ensure that TalkBack is turned off. Open your device's Settings app. Navigate to Accessibility and select Switch Access, then select Settings. On the Switch Access Preferences screen, make sure that Auto-scan is off. Use the volume down key as your "Next" switch: Touch Assign Keys for Scanning > Next. When the dialog opens, press the volume down key. The dialog shows KEYCODE_VOLUME_DOWN. Touch OK to confirm and exit the dialog. Use the volume up key as your "Select" switch: Touch Select. When the dialog opens, press the volume up key. The dialog shows KEYCODE_VOLUME_UP. Touch OK to confirm and exit the dialog. To return to Switch Access Preferences, press the back button. 
Optional: If you're using TalkBack 5.1 or later, you can select Spoken feedback to turn on spoken feedback. To return to the main Switch Access screen, press the back button. At the top of the Switch Access screen, press On/Off to turn on Switch Access. In the confirmation dialog, select OK to confirm permissions. *** Explore your app using Switch Access *** To explore your app with Switch Access, complete these steps: Open your app. To start scanning, press your "Next" switch (volume down). Continue pressing "Next" until you reach the item you want to select. To select the highlighted item, press your "Select" switch (volume up). As you navigate, look for the following issues: * Are you able to complete the main workflows easily? * If you have text or other inputs, can you add and edit content easily? * Are items highlighted only if you can perform an action with them? * Is each item highlighted only once? * Is all functionality that's available through touch screen gestures also available as selectable controls or custom actions within Switch Access? * If you're using TalkBack 5.1 or later and you've turned on spoken feedback, does the spoken feedback for each element convey its content or purpose appropriately? (Learn how to write_meaningful_labels.) *** Optional: Use group selection to see all scannable items *** Group selection is a Switch Access navigation method that lets you see all scannable items at once. This option lets you perform a quick check to see whether the correct elements on the screen are highlighted. To turn on group selection, complete these steps: Open your device's Settings app. Navigate to Accessibility and select Switch Access, then select Settings. On the Switch Access Preferences screen, make sure that Auto-scan is off. Select Scanning method > Group selection. Touch Assign switches for scanning. Make sure that the text under Group selection switch 1 and Group selection switch 2 shows that a switch is assigned to each. 
(If you already followed the steps under "Turn on Switch Access" above, the volume buttons should already be assigned.) To explore your app with Switch Access using group selection, complete these steps: Press the "Select" key (volume up) to highlight all actionable items on the current screen. Look for the following issues: * Are only actionable items highlighted? * Are all actionable items highlighted? * Does the density of highlighted items make sense? Navigate to a different screen to clear the highlight. To learn more about how users can navigate with group selection, see Tips_for using_Switch_Access. **** Other services **** *** BrailleBack *** BrailleBack is an application that lets users connect a refreshable braille display to an Android device over Bluetooth. BrailleBack works with TalkBack to provide a combined speech and braille experience. To test your app with a braille display, learn how to install_and_turn_on BrailleBack. To see the braille (and ASCII translation) that BrailleBack would render, without connecting a braille display, you can use the overlay option in BrailleBack settings: Open your device's Settings app. Navigate to Accessibility and select BrailleBack. Select Settings > Developer options > Show Braille output on screen. *** Voice Access (beta) *** Voice_Access lets users control an Android device with spoken commands. Voice Access is available in a limited English-only beta on devices running Android 5.0 (API level 21) and higher. To test your app with Voice Access, learn how to install_and_turn_on_Voice_Access. ***** Analysis tools ***** Testing with analysis tools can uncover opportunities to improve accessibility that you might miss with manual testing. **** Accessibility Scanner **** The Accessibility_Scanner app scans your screen and provides suggestions to improve the accessibility of your app. 
Accessibility Scanner uses the Accessibility_Testing_Framework and provides specific suggestions after looking at content labels, clickable items, contrast, and more. Learn more:

* Get_started_with_Accessibility_Scanner
* How_to_read_Accessibility_Scanner_results

**** Node tree debugging ****

Accessibility services use a separate representation of your app's UI to operate. As you debug, you might find it useful to view the hierarchy and attributes of UI elements in the same way accessibility services view them. To accomplish this task, you can use node tree debugging. This tool, available in TalkBack, provides information about how an AccessibilityService, such as TalkBack, views UI elements within your app. To learn more about how to use this tool, see Using_Node_Tree_Debugging.

**** UI Automator Viewer ****

The uiautomatorviewer tool provides a convenient GUI to scan and analyze the UI components currently displayed on an Android device. You can use UI Automator to inspect the layout hierarchy and view the properties of UI components that are visible on the foreground of the device. This information lets you create more fine-grained tests, for example by creating a UI selector that matches a specific visible property. The tool is located in the /tools/ directory.

In accessibility testing, this tool is useful for debugging issues found using other testing methods. For example, if manual testing reveals a view that should have speakable text but doesn't, or a view that receives focus but shouldn't, you can use the tool to help locate the source of the bug. To learn more about UI Automator Viewer, see Testing_UI_for_Multiple_Apps.

**** Lint ****

Android Studio shows lint warnings for various accessibility issues and provides links to the places in the source code containing these issues. In the following example, an image is missing a contentDescription attribute. 
The missing content description results in the following message:

[Accessibility] Missing 'contentDescription' attribute on image

Figure 1 shows an example of how this message appears in Android Studio:

[https://developer.android.com/images/guide/topics/ui/accessibility/studio-missing-content-description.png]
Figure 1. Message in Android Studio showing missing contentDescription attribute

If users of accessibility services, such as screen readers, encountered this image within the app itself, they wouldn't be able to understand the image's meaning.

***** Automated testing *****

The Android platform supports several testing frameworks, including Espresso and Robolectric, which allow you to create and run automated tests that evaluate the accessibility of your app. To see a video overview of accessibility testing with Espresso and Robolectric, watch the following video from minute 31:54 to 34:19: Inclusive_design_and testing:_Making_your_app_accessible_-_Google_I/O_2016

**** Espresso ****

Espresso is an Android testing library designed to make UI testing fast and easy. It allows you to interact with UI components under test in your app and assert that certain behaviors occur or specific conditions are met.

*** Enable checks ***

You can enable and configure accessibility testing through the AccessibilityChecks class:

AccessibilityChecks.enable();

By default, the checks run when you perform any view action defined in ViewActions. The check includes the view on which the action is performed as well as all descendant views. You can check the entire view hierarchy of a screen using the following code:

AccessibilityChecks.enable().setRunChecksFromRootView(true);

For more code samples, see these demonstrative_tests.

*** Suppress known issues ***

When first enabling checks, you may encounter a number of issues that you may not be able to deal with immediately. 
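In a JUnit-based Espresso test, the enable() call typically goes in a one-time setup method so that every view action in the class is checked. A minimal sketch (the activity, view ID, and class names are hypothetical; this fragment is not standalone-runnable):

```java
@RunWith(AndroidJUnit4.class)
public class MyActivityAccessibilityTest {

    @BeforeClass
    public static void enableAccessibilityChecks() {
        // Run accessibility checks on every ViewAction performed below.
        AccessibilityChecks.enable();
    }

    @Rule
    public ActivityTestRule<MyActivity> rule =
            new ActivityTestRule<>(MyActivity.class);

    @Test
    public void clickButton_runsChecks() {
        // The click triggers accessibility checks on the pressed view
        // and all of its descendant views.
        onView(withId(R.id.my_button)).perform(click());
    }
}
```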
You can suppress test failures resulting from these issues by setting a matcher for the results that you would like to suppress. To do so, obtain an AccessibilityValidator object by calling the enable() method of the AccessibilityChecks class, then use the returned AccessibilityValidator's setSuppressingResultMatcher() method to configure a suppressing matcher. In the following example, all issues related to touch target size on View objects with a resource ID of "overflow" are suppressed: AccessibilityValidator validator = AccessibilityChecks.enable(); Matcher myMatcher = allOf( matchesCheckNames(is("TouchTargetSizeViewCheck")), matchesViews(withId(R.id.my_overflow))); validator.setSuppressingResultMatcher(myMatcher); **** Robolectric **** Robolectric is an open-source Android testing library that lets you test real Android code on a JVM, without needing to start an emulator. Learn how to get started_with_Robolectric. Note:UI testing with Robolectric has some shortcomings, so you should use other forms of testing in addition to this tool. For example, Robolectric cannot give reliable results for touch target size and duplicate clickable items. To detect these issues, consider using Accessibility_Scanner. *** Suppress known issues *** When first enabling checks for Robolectric, you may encounter a number of issues you may not be able to deal with immediately. You can suppress these errors by setting a matcher for the results that you would like to suppress. For more information, see the documentation for the setSuppressingResultMatcher () method of the AccessibilityUtil class that is available in Robolectric. ***** User testing ***** Along with the other testing methods in this guide, user testing can provide specific and valuable insights about the usability of your app. To find users who can test your app, use methods such as the following: * Reach out to local organizations, colleges, or universities that provide training for people with disabilities. 
* Ask your social circle. There might be people with disabilities who are willing to help.
* Ask a user testing service (such as usertesting.com) if they can test your app and include users with disabilities.
* Join an accessibility forum, such as Accessibility or Eyes-free, and ask for volunteers to try your app.

For more tips, watch the user testing section of this video, from minute 31:10 to 44:51: Behind_the_scenes:_What's_new_in_Android_accessibility_-_Google_I/O_2016
****** Accessibility Testing Checklist ******

***** Checklist sections *****

Testing_Goals Testing_Requirements Testing_Recommendations Special_Cases_and_Considerations Testing_Accessibility_Features Testing_audible_feedback Testing_focus_navigation Testing_gesture_navigation Android_Design:_Accessibility Making_Applications_Accessible

Testing is an important part of making your application accessible to users with varying abilities. Following design and development guidelines for accessibility is an important step toward that goal, but testing for accessibility can uncover problems with user interaction that are not obvious during design and development. This accessibility testing checklist guides you through the important aspects of accessibility testing, including overall goals, required testing steps, recommended testing, and special considerations. This document also discusses how to enable accessibility features on Android devices for testing purposes.

***** Testing Goals *****

Your accessibility testing should have the following high-level goals:

* Users can set up and use the application without sighted assistance.
* All task workflows in the application can be easily navigated using directional controls and provide clear and appropriate feedback.

***** Testing Requirements *****

The following tests must be completed in order to ensure a minimum level of application accessibility.

Directional controls: Verify that the application can be operated without the use of a touch screen. Attempt to use only directional controls to accomplish the primary tasks in the application. Use the keyboard and directional-pad (D-Pad) controls in the Android Emulator or use gesture_navigation on devices with Android 4.1 (API Level 16) or higher.

Note: Keyboards and D-pads provide different navigation paths than accessibility gestures. While gestures allow users to focus on nearly any on-screen content, keyboard and D-pad navigation only allow focus on input fields and buttons. 
TalkBack audio prompts: Verify that user interface controls that provide information (graphics or text) or allow user action have clear and accurate audio descriptions when TalkBack_is_enabled and controls are focused. Use directional controls to move focus between application layout elements. Explore by Touch prompts: Verify that user interface controls that provide information (graphics or text) or allow user action have appropriate audio descriptions when Explore_by_Touch_is_enabled. There should be no regions where contents or controls do not provide an audio description. Touchable control sizes: All controls where a user can select or take an action must be a minimum of 48 dp (approximately 9mm) in length and width, as recommended by Android_Design. Gestures work with TalkBack enabled: Verify that app-specific gestures, such as zooming images, scrolling lists, swiping between pages or navigating carousel controls continue to work when TalkBack_is_enabled. If these gestures do not function, then an alternative interface for these actions must be provided. No audio-only feedback: Audio feedback must always have a secondary feedback mechanism to support users who are deaf or hard of hearing, for example: A sound alert for the arrival of a message should also be accompanied by a system Notification, haptic feedback (if available) or another visual alert. ***** Testing Recommendations ***** The following tests are recommended for ensuring the accessibility of your application. If you do not test these items, it may impact the overall accessibility and quality of your application. Repetitive audio prompting: Check that closely related controls (such as items with multiple components in a list) do not simply repeat the same audio prompt. For example, in a contacts list that contains a contact picture, written name and title, the prompts should not simply repeat "Bob Smith" for each item. 
Audio prompt overloading or underloading: Check that closely related controls provide an appropriate level of audio information that enables users to understand and act on a screen element. Too little or too much prompting can make it difficult to understand and use a control.

***** Special Cases and Considerations *****

The following list describes specific situations that should be tested to ensure an accessible app. Some, none, or all of the cases described here may apply to your application. Be sure to review this list to find out if these special cases apply and take appropriate action.

Review developer special cases and considerations: Review the list of special cases for accessibility development and test your application for the cases that apply.

Prompts for controls that change function: Buttons or other controls that change function due to application context or workflow must provide audio prompts appropriate to their current function. For example, a button that changes function from play video to pause video should provide an audio prompt that is appropriate to its current state.

Video playback and captioning: If the application provides video playback, verify that it supports captioning and subtitles to assist users who are deaf or hard of hearing. The video playback controls must clearly indicate if captioning is available for a video and provide a clear way of enabling captions.

***** Testing Accessibility Features *****

Testing of accessibility features such as TalkBack, Explore by Touch, and accessibility gestures requires setup of your testing device. This section describes how to enable these features for accessibility testing.

**** Testing audible feedback ****

Audible accessibility feedback features on Android devices provide audio prompts that speak the screen content as you move around an application. By enabling these features on an Android device, you can test the experience of users with blindness or low vision using your application. 
Audible feedback for users on Android is typically provided by the TalkBack accessibility service and the Explore by Touch system feature. The TalkBack accessibility service comes preinstalled on most Android devices and can also be downloaded for free from Google_Play.

*** Testing with TalkBack ***

The TalkBack accessibility service works by speaking the contents of user interface controls as the user moves focus onto controls. This service should be enabled as part of testing focus navigation and audible prompts. To enable the TalkBack accessibility service: Launch the Settings application. Navigate to the Accessibility category and select it. Select Accessibility to enable it. Select TalkBack to enable it.

Note: While TalkBack is the most widely available Android accessibility service for users with disabilities, other accessibility services are available and may be installed by users. For more information about using TalkBack, see TalkBack.

*** Testing with Explore by Touch ***

The Explore by Touch system feature is available on devices running Android 4.0 and later, and works by enabling a special accessibility mode that allows users to drag a finger around the interface of an application and hear the contents of the screen spoken. This feature does not require screen elements to be focused using a directional controller, but listens for hover events over user interface controls. To enable Explore by Touch: Launch the Settings application. Navigate to the Accessibility category and select it. Select TalkBack to enable it.

Note: On Android 4.1 (API Level 16) and higher, the system provides a popup message to enable Explore by Touch. On older versions, you must follow the step below. Return to the Accessibility category and select Explore by Touch to enable it.

Note: You must turn on TalkBack first, otherwise this option is not available. For more information about using the Explore by Touch features, see Touch Exploration. 
**** Testing focus navigation ****

Focus navigation is the use of directional controls to navigate between the individual user interface elements of an application in order to operate it. Users with limited vision or limited manual dexterity often use this mode of navigation instead of touch navigation. As part of accessibility testing, you should verify that your application can be operated using only directional controls.

You can test navigation of your application using only focus controls, even if your test device does not have a directional controller. The Android_Emulator provides a simulated directional controller that you can use to test navigation. You can also use a software-based directional controller, such as the one provided by the Eyes-Free_Keyboard, to simulate use of a D-pad on a test device that does not have a physical D-pad.

**** Testing gesture navigation ****

Gesture navigation is an accessibility navigation mode that allows users to navigate Android devices and applications using specific gestures. This navigation mode is available on Android 4.1 (API Level 16) and higher.

Note: Accessibility gestures provide a different navigation path than keyboards and D-pads. While gestures allow users to focus on nearly any on-screen content, keyboard and D-pad navigation only allow focus on input fields and buttons.

To enable gesture navigation:

* Enable both TalkBack and the Explore by Touch feature as described in Testing_with_Explore_by_Touch. When both of these features are enabled, accessibility gestures are automatically enabled.
* You can change gesture settings using Settings > Accessibility > TalkBack > Settings > Manage shortcut gestures.

For more information about using Explore by Touch accessibility gestures, see Touch_Exploration.

Note: Accessibility services other than TalkBack may map accessibility gestures to different user actions. 
If gestures are not producing the expected actions during testing, try disabling other accessibility services before proceeding.
View v = LayoutInflater.from(parent.getContext()) .inflate(R.layout.my_text_view, parent, false);
compile 'com.android.support:appcompat-v7:25.1.1' compile 'com.android.support:cardview-v7:25.1.1' compile 'com.android.support:recyclerview-v7:25.1.1'