I am trying to change the video encoding bit rate of a recording on Android using MediaRecorder.setVideoEncodingBitRate(int).
I looked in the Android documentation, and it lists this method for setting the bit rate, but when I try to use it I get an error saying setVideoEncodingBitRate(int) is not defined in MediaRecorder.
Why is that?
I suggest you check which API level you are building against.
setVideoEncodingBitRate() only arrived in API level 8 (Android 2.2).
If you use a version lower than that, it won't be available.
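If you also have to support older devices, you can guard the call at runtime. A minimal sketch (recorder stands in for your MediaRecorder instance):
// Only call setVideoEncodingBitRate() on API 8+ (Android 2.2); it does not exist before that.
if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.FROYO) {
    recorder.setVideoEncodingBitRate(3000000); // recorder is your MediaRecorder (assumed)
}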
Also, you can use it like this:
webCamRecorder = new MediaRecorder();
if (target_holder == null)
    return;
webCamRecorder.setPreviewDisplay(target_holder.getSurface());
// Sources must be set before the output format.
webCamRecorder.setVideoSource(MediaRecorder.VideoSource.CAMERA);
webCamRecorder.setAudioSource(MediaRecorder.AudioSource.CAMCORDER);
webCamRecorder.setOutputFormat(MediaRecorder.OutputFormat.MPEG_4);
webCamRecorder.setAudioEncodingBitRate(196608);
webCamRecorder.setVideoSize(640, 480);
webCamRecorder.setVideoFrameRate(30);
// Must be called after setOutputFormat() and before prepare().
webCamRecorder.setVideoEncodingBitRate(15000000);
webCamRecorder.setAudioEncoder(MediaRecorder.AudioEncoder.AMR_NB);
webCamRecorder.setVideoEncoder(MediaRecorder.VideoEncoder.H264);
webCamRecorder.setOutputFile("your location to save");
webCamRecorder.prepare(); // throws IOException
webCamRecorder.start();
setVideoEncodingBitRate is an instance method; it seems that you are trying to call it as a static method (MediaRecorder.setVideoEncodingBitRate(int)). Instead, call it on a MediaRecorder object:
MediaRecorder mr = new MediaRecorder();
mr.setVideoEncodingBitRate(someint);
Also, did you import android.media.MediaRecorder?
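If not, add it at the top of your file:
import android.media.MediaRecorder;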
I'm building a React Native application.
I'm trying to meter the current sound level (in decibels).
Libraries in use: react-native-audio and react-native-sound.
Is anybody familiar with this feature?
Thank you.
With the react-native-audio library, you can get the metering value on iOS only.
For currentMetering on Android, you need to customize the native module. I have updated it; add the following to your package.json instead of the stock dependency, and you will get the currentMetering count on Android as well as iOS.
"react-native-audio": "git+https://github.com/Harsh2402/react-native-audio.git",
You can use react-native-audio's currentMetering value to get the sound level in real time.
First, you will have to initialise your recorder (which I will assume you've done). I use prepareRecordingAtPath in a similar way to the below:
AudioRecorder.prepareRecordingAtPath(audioPath, {
    SampleRate: 22050,
    Channels: 1,
    AudioQuality: "Low",
    AudioEncoding: "aac",
    MeteringEnabled: true
});
Then call AudioRecorder.startRecording(); (note that you also have access to .pause() and .stop() methods).
Now, to handle the audio level, retrieve the data returned by the onProgress callback. From what I remember, it includes a currentMetering value you can access. Note that this handler fires every time a new metering reading is retrieved. Like so:
AudioRecorder.onProgress = data => {
    let decibels = Math.floor(data.currentMetering);
    // DO STUFF
};
Hope this helps,
I found this question, but it doesn't work for me. (Nothing happens; tested on a Wiko Rainbow Jam.)
Android - Camera2 : The easiest way to turn on the torch light
My app has to run on API level 16 at minimum! Is there a support CameraManager, or a library (under the Apache license) that I can use?
You can use this.
Initialize the NoobCameraManager singleton:
NoobCameraManager.getInstance().init(this);
You can optionally set the log level for debug logging; logging uses the LumberJack library. The default is LogLevel.None:
NoobCameraManager.getInstance().init(this, LogLevel.Verbose);
After that, you just need to call the singleton to turn the camera flash on or off:
NoobCameraManager.getInstance().turnOnFlash();
NoobCameraManager.getInstance().turnOffFlash();
You can take care of the runtime permission to access the camera yourself, or allow the library to do it for you:
NoobCameraManager.getInstance().takePermissions();
Note: The library will take permissions, if you haven't already, even without calling takePermissions() explicitly. This behavior may change in future.
It's easy to toggle the flash too:
NoobCameraManager.getInstance().toggleFlash();
It's good practice to release all the resources once you're done:
NoobCameraManager.getInstance().release();
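Putting the calls together, a minimal sketch (FlashActivity is a made-up name; this assumes the calls are made from an Activity):
public class FlashActivity extends Activity {

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        // init() must run before any flash calls.
        NoobCameraManager.getInstance().init(this);
        NoobCameraManager.getInstance().turnOnFlash();
    }

    @Override
    protected void onDestroy() {
        // Release the camera so other apps can use it.
        NoobCameraManager.getInstance().release();
        super.onDestroy();
    }
}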
SOLUTION: The problem was that I only fetched the camera's parameters when turning the light on and off. Apparently this must also be done in the constructor or in the overriding onStart() method:
@Override
protected void onStart() {
    super.onStart();
    camera = Camera.open();          // also call this...
    params = camera.getParameters(); // ...and this, in the constructor
}
As fields:
private Camera camera;
private Camera.Parameters params;
Then you can start the flashlight with these snippets of code: Android - Camera2 : The easiest way to turn on the torch light
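For reference, the legacy-API torch toggle boils down to something like this (a sketch using the camera and params fields above):
// Turn the torch on with the legacy android.hardware.Camera API.
params.setFlashMode(Camera.Parameters.FLASH_MODE_TORCH);
camera.setParameters(params);
camera.startPreview();

// ...and off again.
params.setFlashMode(Camera.Parameters.FLASH_MODE_OFF);
camera.setParameters(params);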
I'm trying to build a QR code reader following this tutorial:
http://code.tutsplus.com/tutorials/android-sdk-create-a-barcode-reader--mobile-17162
I managed to get everything working, except that I need the camera to be the front camera of my device instead of the rear camera. I can't find any place in the tutorial that allows me to change this. I tried following this answer, but I still could not get it to work.
Mainly, my issue is with importing the library. I get the following error.
'<>' operator is not allowed for source level below 1.7
When I set my compiler settings to 1.7, I get this
Android requires compiler compliance level 5.0 or 6.0. Found '1.7' instead
I'm not very proficient with Android, and I apologize if this isn't a good question.
So, any way for me to use ZXing with the front camera in my app? Any links?
Thank you very much.
The source code uses Java 7. Android does not require Java <= 6; you can see that the build provided in the project happily feeds Java 7 bytecode to dex and produces a working app. I am not sure what tool you are using that suggests otherwise. Maybe it is old.
You should not need to copy and compile the project's code, though. Why is that necessary? Use the core.jar file.
You don't need any of this to use the front camera. Just invoke by Intent (https://github.com/zxing/zxing/wiki/Scanning-Via-Intent) and set extra SCAN_CAMERA_ID to the ID of the camera you want -- usually 1 for the front one.
Example:
intent.putExtra("SCAN_MODE", "QR_CODE_MODE");
intent.putExtra("SCAN_CAMERA_ID", 1);
If you use IntentIntegrator, you can use setCameraId() to specify the front camera:
IntentIntegrator integrator = new IntentIntegrator(yourActivity);
integrator.setCameraId(1);
integrator.initiateScan();
After quite a bit of searching, I found how to use the front camera. There is this piece of code in com.google.zxing.client.android.camera.CameraConfigurationManager.java:
public void openDriver(SurfaceHolder holder) throws IOException {
    Camera theCamera = camera;
    if (theCamera == null) {
        theCamera = Camera.open();
        if (theCamera == null) {
            throw new IOException();
        }
        camera = theCamera;
    }
    theCamera.setPreviewDisplay(holder);
    // ...
}
Just change the Camera.open() to Camera.open(1).
It worked fine for me.
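If you'd rather not hard-code the camera ID, you can look it up with the legacy Camera API (a sketch; available from API 9):
// Find the ID of the first front-facing camera (-1 if none).
int frontCameraId = -1;
Camera.CameraInfo info = new Camera.CameraInfo();
for (int i = 0; i < Camera.getNumberOfCameras(); i++) {
    Camera.getCameraInfo(i, info);
    if (info.facing == Camera.CameraInfo.CAMERA_FACING_FRONT) {
        frontCameraId = i;
        break;
    }
}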
VOICE_CALL, VOICE_DOWNLINK, and VOICE_UPLINK
are not working on Android 4.0, but they work on Android 2.3 (actual device). I have uploaded a dummy project that records all outgoing calls so that you can see for yourself:
http://www.mediafire.com/?img6dg5y9ri5c7rrtcajwc5ycgpo2nf
You just have to change audioSource = MediaRecorder.AudioSource.MIC; to audioSource = MediaRecorder.AudioSource.VOICE_CALL; on line 118 of TService.java.
If you come across any errors, tell me. Any suggestions related to this are welcome.
After a lot of searching, I found that some manufacturers have closed off access to this functionality because call recording is not allowed in some countries. If anyone finds this question and solves the problem some other way, please post the solution here; many people have the same issue.
Try MediaRecorder.AudioSource.VOICE_RECOGNITION. I had the same problem: the ASUS Transformer uses the microphone near the back camera by default, and audio is very quiet in that case. VOICE_CALL doesn't work on this tablet, but with VOICE_RECOGNITION it uses the front microphone and the audio volume is fine.
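For example (recorder being your MediaRecorder instance, an assumption):
recorder.setAudioSource(MediaRecorder.AudioSource.VOICE_RECOGNITION);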
OK, in my case this code (thank you eyal!) worked on a Samsung Galaxy Note 6:
String manufacturer = Build.MANUFACTURER;
if (manufacturer.toLowerCase().contains("samsung")) {
    recorder.setAudioSource(MediaRecorder.AudioSource.VOICE_COMMUNICATION);
} else {
    recorder.setAudioSource(MediaRecorder.AudioSource.VOICE_CALL);
}
Try adding a delay before starting the recorder; it may help:
new Handler().postDelayed(new Runnable() {
    @Override
    public void run() {
        mMediaRecorder.start();
    }
}, 1000);
I have seen lots of examples of Android's VideoView API being used to stream data from an external server to a device (VideoView internally uses an RTP and RTSP stack to receive data).
However, there is very little discussion of the possibility of using Android's internal RTSP and RTP stacks for server capabilities, i.e. making an Android device act as a streaming server and stream media out.
Is it possible ?
And where inside the Android native code can I start digging in to achieve such functionality ?
Would appreciate details.
Thanks
Amit
A bit late, but:
You can set the MediaRecorder output format to 7. This is defined in
frameworks/base/media/java/android/media/MediaRecorder.java;
check that file for details:
/** @hide Stream over a socket, limited to a single stream */
public static final int OUTPUT_FORMAT_RTP_AVP = 7;
The destination is controllable via the streaming.ip and streaming.port system properties (set them with setprop). The AV data will then be streamed to the given destination address.
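For illustration, a rough, untested sketch of how that might be wired up (OUTPUT_FORMAT_RTP_AVP is @hide, so the raw value 7 is passed directly; this assumes a platform build that honors the properties above):
MediaRecorder recorder = new MediaRecorder();
recorder.setAudioSource(MediaRecorder.AudioSource.MIC);
recorder.setVideoSource(MediaRecorder.VideoSource.CAMERA);
recorder.setOutputFormat(7); // OUTPUT_FORMAT_RTP_AVP, hidden in the public SDK
recorder.setAudioEncoder(MediaRecorder.AudioEncoder.AMR_NB);
recorder.setVideoEncoder(MediaRecorder.VideoEncoder.H264);
recorder.prepare(); // destination comes from streaming.ip / streaming.port
recorder.start();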
The RTP code (native) itself lives in the frameworks/base/media/libstagefright/rtsp directory.
Happy code digging!
There is also the possibility of using the libstreaming library (https://github.com/fyhertz/libstreaming).
The documentation on GitHub gives an example of how to set up the server, but basically you need to add a net.majorkernelpanic.streaming.gl.SurfaceView to your layout:
<net.majorkernelpanic.streaming.gl.SurfaceView
    android:id="@+id/surface"
    android:layout_width="match_parent"
    android:layout_height="match_parent" />
Add this to your manifest:
<service android:name="net.majorkernelpanic.streaming.rtsp.RtspServer"/>
Include the libstreaming library. If you are working with a newer version of Android Studio, you need to clone libstreaming as a separate project and import it as a module; afterwards, run a build on the build.gradle in libstreaming. Then you can work with the library.
The last step is to create an Activity. The simplest possible one looks like this:
public class RemoteStreamingActivity extends Activity {

    private SurfaceView mSurfaceView;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_remote_streaming);
        setRequestedOrientation(ActivityInfo.SCREEN_ORIENTATION_PORTRAIT);
        handleGestures();
        mSurfaceView = (SurfaceView) findViewById(R.id.surface);
        SessionBuilder.getInstance()
                .setSurfaceView(mSurfaceView)
                .setPreviewOrientation(90)
                .setContext(getApplicationContext())
                .setAudioEncoder(SessionBuilder.AUDIO_NONE)
                .setVideoEncoder(SessionBuilder.VIDEO_H264);
        this.startService(new Intent(this, RtspServer.class));
    }

    @Override
    public void onDestroy() {
        super.onDestroy();
        this.stopService(new Intent(this, RtspServer.class));
    }
}
If you want to test whether the RTSP server is running, you can try using VLC and connecting via the URL: rtsp://{ipAddressOfYourDevice}:8086?h264=200-20-320-240