I have a face.obj and a face.mtl file, but libGDX doesn't load face.mtl; it only takes face.obj, which is why I'm not able to show the texture on it. How can I load face.mtl along with face.obj?
Thanks.
Indeed, LibGDX did not load .MTL files before, and you were expected to load the texture assets for your .OBJ model manually.
Now (Oct. 2012) it does. (As seen here: https://github.com/libgdx/libgdx/pull/71)
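For background, the link between the two files lives inside the .obj itself: an `mtllib` line names the material file and `usemtl` lines select materials from it, so a loader that honors those lines picks up the .mtl automatically. As an illustration only (the class name and helper are made up, and this uses plain Java rather than libGDX), a minimal scan for those references might look like:

```java
import java.io.BufferedReader;
import java.io.StringReader;
import java.util.ArrayList;
import java.util.List;

public class ObjMtlRefs {
    // Collect the material-library and material-name lines referenced by an .obj.
    public static List<String> materialRefs(String objText) throws Exception {
        List<String> refs = new ArrayList<>();
        BufferedReader r = new BufferedReader(new StringReader(objText));
        String line;
        while ((line = r.readLine()) != null) {
            line = line.trim();
            if (line.startsWith("mtllib ") || line.startsWith("usemtl ")) {
                refs.add(line);
            }
        }
        return refs;
    }

    public static void main(String[] args) throws Exception {
        String obj = "mtllib face.mtl\nv 0 0 0\nusemtl skin\nf 1 1 1\n";
        System.out.println(materialRefs(obj)); // [mtllib face.mtl, usemtl skin]
    }
}
```

If the .mtl referenced by `mtllib` is missing or renamed, a loader that resolves these references is exactly what fails.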
Related
I have been working with the ARToolKit SDK for Android for some time.
In the ARToolKit SDK, I have worked with ARBaseLib and ARSimpleNativeCarsProj and implemented them successfully. But now I am trying to add external 3D objects (.obj and .mtl), and I am unable to render the new object files.
I have also looked into the source code provided in this link,
https://github.com/kosiara/artoolkit-android-studio-example
but the problem here is that the 3D object (a cube) is built in draw() with raw OpenGL calls; instead, I would want to load an external 3D object.
More Explanation:
Okay, SimpleNativeCarsProj ships with a 3D object as two files (.OBJ and .MTL) in the assets/Data folder.
Case 1: I tried replacing the existing 3D object with another one; the app crashes on launch.
Case 2: Working around a little, I found these files are pushed to the cache folder when the app loads. I invalidated the caches and restarted Android Studio, then rebuilt and ran the app; it still crashes on launch.
In short, I am unable to replace, delete, or add 3D object files in SimpleNativeCarsProj.
Any heads-up would be appreciated.
Convert your FBX files
The Encoder works with FBX (.fbx) files. We recommend using FBX wherever possible, as tool support for FBX is widely available.
http://www.wikitude.com/products/wikitude-sdk-features/wikitude-3d-encoder/
Give ArToolKitJpctBaseLib a try. It is a wrapper around ARToolKit + jPCT-AE (a 3D engine for Android) that aims to simplify the creation of AR apps for Android.
I am working on a game being developed with AndEngine. I can successfully use .jpg and .png files, but I run into problems when I try to use animated Maya files (which have the .max extension) and also .obj files. Can anyone please suggest how I can use .max files, or at least .obj files, in my program? I hope my problem is clear. Thanks in advance.
I am not entirely sure, but: AndEngine is a 2D game engine, while .max and .obj files usually describe 3D objects/animations. For example, .obj files can contain vertices, while AndEngine is not capable of using vertices directly (although some extensions might).
What I am sure of: you can always take small screenshots of your Maya animation, put them all in a row in a PNG file, and use that sprite sheet instead.
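As a sketch of that approach (desktop Java with `BufferedImage` standing in for whatever tool you actually use to export the frames; the frame size and count here are made up), stitching the frames into one horizontal strip looks like:

```java
import java.awt.Graphics2D;
import java.awt.image.BufferedImage;

public class SpriteStrip {
    // Lay the frames out left-to-right in a single image (one sprite-sheet row).
    public static BufferedImage stitch(BufferedImage[] frames) {
        int w = frames[0].getWidth(), h = frames[0].getHeight();
        BufferedImage strip = new BufferedImage(w * frames.length, h, BufferedImage.TYPE_INT_ARGB);
        Graphics2D g = strip.createGraphics();
        for (int i = 0; i < frames.length; i++) {
            g.drawImage(frames[i], i * w, 0, null); // each frame at its own x offset
        }
        g.dispose();
        return strip;
    }

    public static void main(String[] args) {
        BufferedImage[] frames = new BufferedImage[8];
        for (int i = 0; i < frames.length; i++) {
            frames[i] = new BufferedImage(64, 64, BufferedImage.TYPE_INT_ARGB);
        }
        BufferedImage strip = stitch(frames);
        System.out.println(strip.getWidth() + "x" + strip.getHeight()); // 512x64
    }
}
```

The resulting strip can then be loaded in AndEngine as a tiled texture region, with one tile per frame.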
I'm using the min3d example to show a .3ds object, but the target file has to be in res/raw and the texture in assets. Can I load the target file from external storage instead? Thanks!
Yes. Here is a framework based on min3d that can load from the SD card: https://github.com/MasDennis/Rajawali/tree/master/src/rajawali
Or, if you do not want to use another library, look at this min3d issue, where some people have proposed a solution:
http://code.google.com/p/min3d/issues/detail?id=40
I'm creating a bike racing game for Android. I plan to create models in Blender, export them to .obj format, and then render them on the device. I'm using the min3d framework to do the .obj parsing and rendering. I followed the tutorial in this page for parsing the .obj file and rendering it on the screen. When I tried a simple object like a cube, everything works fine and it is rendered perfectly. But when I tried to load a simple bike model I downloaded from the web, it doesn't work: the app crashes and I have to force-stop it. In LogCat I see either a java.lang.NumberFormatException, a java.lang.NullPointerException, or a resource-not-found exception inside the parse() method. I have no clue why this happens.
I have the following doubts about where it possibly could have gone wrong:
1) Following the tutorial, I changed the file names from .obj and .mtl to _obj and _mtl. But there was a line in the .obj file that names the .mtl file, so I changed that to _mtl too. It still didn't work. Is there something similar I need to do anywhere else? Do I need to modify any of the files in any other way?
2) Sometimes I find that models created in Blender 2.49 are parsed and rendered fine, while models created in Blender 2.6 cause this trouble. min3d was also written in the era of the older Blender versions. So should I use only Blender 2.49 for creating the models?
P.S: I'm completely new to graphics so I'm fighting a lot with this without giving up. Any help would be greatly appreciated. :)
I believe this is because of a missing texture. Did you copy the texture image into the res/drawable folder?
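Another thing worth ruling out: Blender 2.6 writes face entries such as `f 1//3` when a mesh has no UV coordinates, and calling `Integer.parseInt` on the empty middle field throws exactly the kind of `NumberFormatException` described above. A tolerant parse of one face token (plain Java, independent of min3d, written here just to show the failure mode) treats missing fields as absent:

```java
public class FaceIndexParser {
    // Parse one "v/vt/vn" face token from an .obj face line.
    // Missing fields (e.g. "1//3" has no texture coordinate) become 0;
    // OBJ indices are 1-based, so 0 safely means "not present".
    public static int[] parseVertexToken(String token) {
        String[] parts = token.split("/", -1); // -1 keeps empty fields
        int[] idx = new int[3]; // {vertex, texcoord, normal}
        for (int i = 0; i < parts.length && i < 3; i++) {
            idx[i] = parts[i].isEmpty() ? 0 : Integer.parseInt(parts[i]);
        }
        return idx;
    }

    public static void main(String[] args) {
        int[] a = parseVertexToken("1/2/3"); // all three fields present
        int[] b = parseVertexToken("1//3");  // no texture coordinate
        System.out.println(a[1] + " " + b[1]); // prints "2 0"
    }
}
```

If your bike model has faces without UVs or normals, either re-export with UVs and normals enabled, or patch the parser along these lines.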
This question is verging on the subjective, but not quite: I hope to hear of others' experience with this (though well-reasoned answers should be possible too).
I'm writing an Android OpenGL ES 1.1 app. It uses the NDK: the core OpenGL rendering is in native code, much like the san-angeles OpenGL sample that comes with the NDK. I'm using OpenGL textures that are read into the app as JPGs. I know how this is done, but I'm wondering where the most efficient place to do it is (by efficient, I mean quick execution).
To illustrate, here's a few scenarios for binding OpenGL textures based on the input JPGs in my project:
1) The Java code makes a bitmap from the JPG (i.e., decompresses it), and binds and loads the texture using the Java OpenGL bindings. The texture IDs are passed through the NDK to native code so that native code can use them for texture mapping.
2) The Java code makes a bitmap from the JPG (i.e. decompresses it), and passes the raw image data through NDK to native code, which then binds and loads the texture from that raw image data.
3) The Java code passes the compressed JPG data through the NDK to the native code, which decompresses it and then binds and loads the texture.
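To make the Java-side decompression step in scenarios 1 and 2 concrete, here is a sketch using desktop `javax.imageio` as a stand-in for Android's `BitmapFactory` (the class and method names below are illustrative, and the pixel layout you hand to native code would need to match what your `glTexImage2D` call expects):

```java
import java.awt.image.BufferedImage;
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import javax.imageio.ImageIO;

public class JpegToRaw {
    // Decode a compressed JPEG into packed ARGB ints: this is the raw image
    // data you would pass through JNI for native code to upload as a texture.
    public static int[] decode(byte[] jpegBytes) throws Exception {
        BufferedImage img = ImageIO.read(new ByteArrayInputStream(jpegBytes));
        int w = img.getWidth(), h = img.getHeight();
        int[] pixels = new int[w * h];
        img.getRGB(0, 0, w, h, pixels, 0, w);
        return pixels;
    }

    public static void main(String[] args) throws Exception {
        // Round-trip a tiny generated image so the example is self-contained.
        BufferedImage src = new BufferedImage(16, 16, BufferedImage.TYPE_INT_RGB);
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        ImageIO.write(src, "jpg", out);
        int[] pixels = decode(out.toByteArray());
        System.out.println(pixels.length); // 256 (16 x 16)
    }
}
```

In scenario 3, this decode step moves wholesale into native code (libjpeg or similar), and only the compressed bytes cross the JNI boundary.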
I'm using NDK and native code not for speed reasons, but for portability reasons -- I want my core OpenGL code to work on iPhone and Android, much as in the san-angeles OpenGL sample that comes with NDK. I'm aware that native isn't always necessarily faster than Java code.
Native decompression of a JPEG will be faster than a Java implementation, assuming both use the same algorithm. The difference, however, will not be substantial.
For portability, I keep as much code as possible native. That way, when I move between platforms, I have very little to do to create the port. I used SOIL to decompress JPGs in native code and found performance comparable to the iOS version running the same code; Android certainly doesn't seem any slower.
Regarding assets, I found that ZIP decompression was very slow indeed. Changing the extension of assets to .mp3 sped up loading considerably, because .mp3 files thankfully don't get compressed.
When the Android package is built, all files in the assets folder are placed into the APK, which is a zip file containing the application, resources, and assets. Some file types are added to that zip without compression, and .mp3 is one of them. By renaming your files with an .mp3 extension, they are stored uncompressed and therefore load much faster.
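You can see the stored-vs-deflated distinction with plain `java.util.zip` (the entry names here are made up; note that a STORED entry must declare its size and CRC up front, which is a quirk of writing them with `ZipOutputStream`):

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.util.zip.CRC32;
import java.util.zip.ZipEntry;
import java.util.zip.ZipInputStream;
import java.util.zip.ZipOutputStream;

public class StoredVsDeflated {
    public static byte[] buildZip(byte[] data) throws Exception {
        ByteArrayOutputStream buf = new ByteArrayOutputStream();
        try (ZipOutputStream zip = new ZipOutputStream(buf)) {
            // Default method: DEFLATED, i.e. the entry gets compressed.
            zip.putNextEntry(new ZipEntry("model.obj"));
            zip.write(data);
            zip.closeEntry();

            // STORED entry: written byte-for-byte, readable without inflation.
            ZipEntry stored = new ZipEntry("model.mp3");
            stored.setMethod(ZipEntry.STORED);
            stored.setSize(data.length);
            CRC32 crc = new CRC32();
            crc.update(data);
            stored.setCrc(crc.getValue());
            zip.putNextEntry(stored);
            zip.write(data);
            zip.closeEntry();
        }
        return buf.toByteArray();
    }

    public static void main(String[] args) throws Exception {
        byte[] zipBytes = buildZip(new byte[1024]);
        try (ZipInputStream in = new ZipInputStream(new ByteArrayInputStream(zipBytes))) {
            ZipEntry e;
            while ((e = in.getNextEntry()) != null) {
                // ZipEntry.DEFLATED is 8, ZipEntry.STORED is 0
                System.out.println(e.getName() + " method=" + e.getMethod());
            }
        }
    }
}
```

A STORED entry is what the .mp3 rename buys you inside the APK: the asset can be read (or even memory-mapped) directly, with no inflate pass at load time.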
My subjective answer to your question would be
4) Do all of your texture loading and asset management in native code, using the same code you use for iOS. To decompress the JPEGs, use libjpeg-turbo or SOIL: SOIL is easier, libjpeg-turbo is very fast. Access your assets using libzip and libz, making sure you add an .mp3 extension to every file to prevent zip compression.
SOIL http://www.lonesock.net/soil.html
LIBZIP http://www.nih.at/libzip/
LIBZ available in NDK
libjpeg-turbo http://git.linaro.org/gitweb?p=people/tomgall/libjpeg-turbo/libjpeg-turbo.git;a=summary