I am trying to record audio in an app that runs a WebView. For example, I load https://online-voice-recorder.com/ in the WebView and record some audio over the mic. I have granted the necessary permissions and am able to record and replay the audio. On Windows 10 it works perfectly, but on my Android Oreo 8.1.0 device running Chrome version 80 the audio is completely distorted. I tested it on another device and the audio is only partially legible.
Is this standard behavior?
What I have tried to narrow down the problem: I wrote a backend .NET Core 3.1 app to play back the audio, and it plays the same distorted sound.
I have also tried both the default sampling rate and custom ones; in either case the audio remains distorted.
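For reference, the WebView permission handling involved is roughly the following (a minimal sketch, not my exact code; it assumes RECORD_AUDIO and MODIFY_AUDIO_SETTINGS are already declared in the manifest and granted at runtime):

```java
import android.app.Activity;
import android.os.Bundle;
import android.webkit.PermissionRequest;
import android.webkit.WebChromeClient;
import android.webkit.WebView;

public class RecorderActivity extends Activity {
    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        WebView webView = new WebView(this);
        setContentView(webView);

        webView.getSettings().setJavaScriptEnabled(true); // the recorder page is JS-based
        webView.setWebChromeClient(new WebChromeClient() {
            @Override
            public void onPermissionRequest(PermissionRequest request) {
                // Grant the page's getUserMedia request for the microphone.
                request.grant(new String[] { PermissionRequest.RESOURCE_AUDIO_CAPTURE });
            }
        });
        webView.loadUrl("https://online-voice-recorder.com/");
    }
}
```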
Related
Has anyone been able to successfully play an HD video using Android Things on a Raspberry Pi? If so, was there anything special you had to do to get it to work? If not, does anyone know why it isn't working?
I made a simple video player that plays a local video from disk. The app runs fine on a phone, but with Android Things most of the frames are dropped.
I am using ExoPlayer for the video playback, and the same issue is observed when MediaPlayer is used. Hardware acceleration is also enabled in the manifest file.
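For context, the MediaPlayer fallback path is essentially the stock pattern below (a minimal sketch, not the actual app code; the file path is hypothetical), which suggests the bottleneck is below the player APIs rather than in the player code:

```java
import android.app.Activity;
import android.media.MediaPlayer;
import android.os.Bundle;
import android.view.SurfaceHolder;
import android.view.SurfaceView;
import java.io.IOException;

public class PlayerActivity extends Activity {
    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        SurfaceView surfaceView = new SurfaceView(this);
        setContentView(surfaceView);

        surfaceView.getHolder().addCallback(new SurfaceHolder.Callback() {
            @Override
            public void surfaceCreated(SurfaceHolder holder) {
                try {
                    MediaPlayer player = new MediaPlayer();
                    player.setDataSource("/sdcard/Movies/sample.mp4"); // hypothetical path
                    player.setDisplay(holder);                         // render to the SurfaceView
                    player.setOnPreparedListener(MediaPlayer::start);
                    player.prepareAsync();
                } catch (IOException e) {
                    e.printStackTrace();
                }
            }
            @Override public void surfaceChanged(SurfaceHolder holder, int f, int w, int h) {}
            @Override public void surfaceDestroyed(SurfaceHolder holder) {}
        });
    }
}
```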
I am developing an app with Android Studio 2.1 and I would like to record a video of a usage example, with sound.
I have already recorded some videos, but always without sound. As the sound in my app is a key feature, I would like the video to include my app's audio output.
Is it possible? If it is, how can I do it?
Thanks!
Maybe you should try using an external app for recording both screen and sound. I dug in for just a few minutes and found some apps.
Here is a link to a top list: http://www.androidauthority.com/best-screen-recording-apps-600838/
The last one seems to be the one you are looking for (SCR Screen Recorder, which requires root). Hope it works!
Another option could be to use a male-to-male audio cable and record the audio on a PC.
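If you would rather do it in code than with a third-party app, the MediaProjection API (added in API 21) can feed a virtual display into a MediaRecorder. Be aware it records from the mic, not your app's internal audio; direct capture of app audio only arrived much later, in Android 10. A minimal sketch with illustrative sizes and a hypothetical output path:

```java
import android.app.Activity;
import android.content.Context;
import android.content.Intent;
import android.hardware.display.DisplayManager;
import android.media.MediaRecorder;
import android.media.projection.MediaProjection;
import android.media.projection.MediaProjectionManager;
import java.io.IOException;

public class ScreenRecordHelper {
    public static final int REQUEST_CODE = 1001;  // arbitrary request id
    private final MediaRecorder recorder = new MediaRecorder();
    private MediaProjectionManager manager;

    // Step 1: ask the user for permission to capture the screen.
    public void requestCapture(Activity activity) {
        manager = (MediaProjectionManager)
                activity.getSystemService(Context.MEDIA_PROJECTION_SERVICE);
        activity.startActivityForResult(manager.createScreenCaptureIntent(), REQUEST_CODE);
    }

    // Step 2: call this from onActivityResult with the returned resultCode/data.
    public void startRecording(int resultCode, Intent data) throws IOException {
        recorder.setAudioSource(MediaRecorder.AudioSource.MIC);   // mic only, not app audio
        recorder.setVideoSource(MediaRecorder.VideoSource.SURFACE);
        recorder.setOutputFormat(MediaRecorder.OutputFormat.MPEG_4);
        recorder.setVideoEncoder(MediaRecorder.VideoEncoder.H264);
        recorder.setAudioEncoder(MediaRecorder.AudioEncoder.AAC);
        recorder.setVideoSize(1280, 720);               // pick a size the device supports
        recorder.setOutputFile("/sdcard/demo.mp4");     // hypothetical path
        recorder.prepare();

        MediaProjection projection = manager.getMediaProjection(resultCode, data);
        projection.createVirtualDisplay("demo", 1280, 720, 320,
                DisplayManager.VIRTUAL_DISPLAY_FLAG_AUTO_MIRROR,
                recorder.getSurface(), null, null);
        recorder.start();
    }
}
```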
I'm trying to use a Samsung Galaxy S6 Edge on Android 5.1.1 (rooted) as a basic audio/video live-streaming rig, but I can't route audio from an external USB sound card (a Behringer UCA202, USB Audio Class 1.0 compliant) to the default phone camera application.
The default camera app has an option to create a live event on YouTube and works great, but unfortunately it can only get audio from the internal mic, as I expected. The sound card itself is working, and I can get sound from it using other audio recording apps.
Is there any option to change the default audio input from the internal mic to the USB audio in? Maybe AlsaMixer is the way to go? I looked for some kind of virtual mixer on the Play Store, but without success.
The only post I've found asking something similar is this one:
Change Android Audio Record Default Input Source
but it's from three years ago :(
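For what it's worth, on Android 6.0+ an app can at least pin its own recording to the USB input with AudioRecord.setPreferredDevice; there is no public API for changing the system-wide default input, which is why the stock camera app keeps using the internal mic. A minimal sketch, assuming an API 23+ device:

```java
import android.content.Context;
import android.media.AudioDeviceInfo;
import android.media.AudioFormat;
import android.media.AudioManager;
import android.media.AudioRecord;
import android.media.MediaRecorder;

public class UsbAudioInput {
    // Sketch for API 23+: route this app's capture to the USB card. It does NOT
    // change the system-wide default, so other apps (like the camera) are unaffected.
    public static AudioRecord recordFromUsb(Context context) {
        AudioManager am = (AudioManager) context.getSystemService(Context.AUDIO_SERVICE);
        AudioRecord record = new AudioRecord(
                MediaRecorder.AudioSource.MIC, 44100,
                AudioFormat.CHANNEL_IN_STEREO, AudioFormat.ENCODING_PCM_16BIT,
                AudioRecord.getMinBufferSize(44100,
                        AudioFormat.CHANNEL_IN_STEREO, AudioFormat.ENCODING_PCM_16BIT));

        for (AudioDeviceInfo device : am.getDevices(AudioManager.GET_DEVICES_INPUTS)) {
            if (device.getType() == AudioDeviceInfo.TYPE_USB_DEVICE) {
                record.setPreferredDevice(device);   // route capture to the UCA202
                break;
            }
        }
        record.startRecording();
        return record;
    }
}
```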
I'm developing an Android app to send a live video & audio stream to a PC. The app captures the camera and mic and sends the live stream, and I use VLC player to play it. It works very well on my HTC S710e (Android 4.0.4), but on other phones it's very choppy. I tested many phones, some with better hardware than the S710e and some with worse, but all of them are very choppy.
I debugged it for a long time and found that I cannot modify the frame rate of the video: although I set the frame rate to 10 or 15, the live video in VLC shows a frame rate of 30 (H.263).
So how can I modify the frame rate? I hope someone can help me, thank you.
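For context, with the pre-API-21 Camera API the frame rate is requested in two separate places, and on many devices both are treated as hints that the hardware encoder ignores, which matches the fixed 30 fps you're seeing. A minimal sketch, not a complete recorder setup:

```java
import android.hardware.Camera;
import android.media.MediaRecorder;

public class RecorderSetup {
    // Sketch (old Camera API, matching Android 4.x): request 15 fps in both places.
    // Both values are hints -- many devices clamp the encoder to a fixed rate anyway.
    public static MediaRecorder buildRecorder() {
        Camera camera = Camera.open();
        Camera.Parameters params = camera.getParameters();
        params.setPreviewFpsRange(15000, 15000);   // units are fps * 1000; must match a
        camera.setParameters(params);              // range from getSupportedPreviewFpsRange()
        camera.unlock();                           // hand the camera over to MediaRecorder

        MediaRecorder recorder = new MediaRecorder();
        recorder.setCamera(camera);
        recorder.setVideoSource(MediaRecorder.VideoSource.CAMERA);
        recorder.setOutputFormat(MediaRecorder.OutputFormat.MPEG_4);
        recorder.setVideoFrameRate(15);            // the second place the rate is requested
        recorder.setVideoEncoder(MediaRecorder.VideoEncoder.H263);
        return recorder;
    }
}
```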
Ok. So there are a bazillion different Android devices. I have a video streaming service that works wonderfully for iOS. My app has a live video feature and a saved video clip playback feature (which streams to the device too). I've run some tests on different Android devices and get a whole bunch of different playback results. I am using a 640x480 h.264 base profile video. Streaming that video works only on some devices. For other devices, the same video can be re-streamed at a lower resolution, and that works on some of them, but still not on others. The high-profile streaming goes through http://www.wowzamedia.com/ (RTSP) and doesn't work on any Android device (but works on the iPhone). The lowest and worst option is Motion JPEG, which works on all devices tested so far.
So my question is: how can I figure out (without having to test every device on the market) whether a device will play 640x480 h.264 base profile; if that won't work, play the low-resolution video; and if that doesn't work, default to Motion JPEG?
Also, any idea why my RTSP stream transcoded through Wowza works on the iPhone but not on any Android device (not even the Motorola Atrix)?
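A partial answer to the detection half: from API 16 on you can query the codec list before picking a stream, so the fallback chain only has to be decided once per device (a minimal sketch; devices older than API 16, like most from that era, predate this and still need trial-and-error):

```java
import android.media.MediaCodecInfo;
import android.media.MediaCodecList;

public class CodecCheck {
    // Sketch (API 16+): does the device advertise an H.264 decoder at all?
    // If not, fall back to the lower-resolution stream or Motion JPEG.
    public static boolean hasAvcDecoder() {
        for (int i = 0; i < MediaCodecList.getCodecCount(); i++) {
            MediaCodecInfo info = MediaCodecList.getCodecInfoAt(i);
            if (info.isEncoder()) continue;        // we only care about decoders
            for (String type : info.getSupportedTypes()) {
                if (type.equalsIgnoreCase("video/avc")) {
                    return true;
                }
            }
        }
        return false;
    }
}
```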
Streaming on Android is an absolute mess. Most devices don't support anything higher than Baseline 3.0. If you encode for the iPhone 3, it should generally work via RTSP. Newer versions of Android support HLS, but it's hit or miss and largely dependent on specific devices.
I resolved this problem. Check the RTP implementation in your streaming service and the x264 profile. My RTSP server works fine on 90% of devices.
P.S.
Video frameworks in different Android versions can implement the RTP and RTSP protocols with some differences.
These are some of the links/issues I have come across while trying to make streaming work on various devices.
MediaPlayer seekTo doesn't work for streams
MediaPlayer resets position to 0 when started after seek to a different position
MediaPlayer seekTo inconsistently plays songs from beginning
Basic streaming audio works in 2.1 but not in 2.2
MediaPlayer.seekTo() does not work for unbuffered position
Streaming video when seek back buffering start again in videoView/Mediaplayer
Even the big shots on Stack Overflow are wondering about this.
If you just want streaming without seeking (which is lame), that can be achieved. But then if you receive a call while you are watching, you will end up back at the start.
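For reference, all of the issues above bite the same plain setup; the divergence is in what happens at the seekTo for an unbuffered position (a minimal sketch; the URL is a placeholder):

```java
import android.media.MediaPlayer;
import java.io.IOException;

public class StreamPlayer {
    // Sketch of the plain streaming setup the linked issues are about.
    // Seeking to a position that isn't buffered yet is where devices diverge:
    // some honor it, many restart from 0.
    public static MediaPlayer streamFrom(String url) throws IOException {
        MediaPlayer player = new MediaPlayer();
        player.setDataSource(url);                 // e.g. an http:// media URL
        player.setOnPreparedListener(mp -> {
            mp.seekTo(60_000);                     // try to resume at 1:00 ...
            mp.start();                            // ... may start at 0 instead
        });
        player.prepareAsync();                     // don't block the UI thread
        return player;
    }
}
```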