I see online that some people use ScalarsConverterFactory.create() to deserialize Strings and other primitive types. Is this necessary, or can Moshi and kotlinx.serialization handle these primitive types?
It depends on whether you're using JSON. If your response body is unquoted, use the Scalars converter:
Hello I'm plain text
If it's quoted, use a JSON converter.
"Hello, I'm a JSON string"
I am trying to parse the results of an API call which returns a unique first property.
{
"AlwaysDifferent12345": {
"fixedname1" : "ABC1",
"fixedname2" : "ABC2"
}
}
I am using Retrofit2 and Jackson/Gson and cannot figure out how to cope with dynamic property names within the Retrofit2 framework. The following works fine:
data class AlwaysDifferentDTO(
    @JsonProperty("AlwaysDifferent12345") val alwaysDifferentEntry: AlwaysDifferentEntry
)
I have tried
data class AlwaysDifferentDTO(
    @JsonProperty
    val response: Map<String, AlwaysDifferentEntry>
)
But this returns errors like Can not instantiate value of type... The return value from the API has a fixed shape, i.e. Map<String, Object>.
I have read that you can write a deserializer, but it looks like I would need to deserialize the whole object when all I want to do is ignore the dynamic property name in the response.
I have read
https://discuss.kotlinlang.org/t/set-dynamic-serializedname-annotation-for-gson-data-class/14758
and several other answers. Given that unique property names are quite common, it would be nice to understand how people deal with this when using Retrofit2.
Thanks
Because the JSON doesn't have a 1-to-1 mapping to your class, Jackson can't map it automatically using annotations. You are going to need to write your own deserializer.
This tutorial shows how to create a custom deserializer for Jackson: https://www.baeldung.com/jackson-deserialization
In the tutorial you will see that the first line inside the deserialize function is
JsonNode node = jp.getCodec().readTree(jp);
Using this line you can read the whole JSON tree as a node, and once you have it you can call:
JsonNode AlwaysDifferent12345Node = node.findParent("fixedname1");
Now that you have that node you can retrieve its values as shown in the rest of the tutorial. Once you have all the values, return a new instance of the AlwaysDifferentDTO data class.
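Putting it together, a minimal sketch of such a deserializer, assuming the AlwaysDifferentDTO and AlwaysDifferentEntry classes from the question and that you register it on the DTO with @JsonDeserialize(using = AlwaysDifferentDeserializer.class):

import java.io.IOException;
import com.fasterxml.jackson.core.JsonParser;
import com.fasterxml.jackson.databind.DeserializationContext;
import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.deser.std.StdDeserializer;

public class AlwaysDifferentDeserializer extends StdDeserializer<AlwaysDifferentDTO> {

    public AlwaysDifferentDeserializer() {
        super(AlwaysDifferentDTO.class);
    }

    @Override
    public AlwaysDifferentDTO deserialize(JsonParser jp, DeserializationContext ctxt)
            throws IOException {
        JsonNode node = jp.getCodec().readTree(jp);
        // The top-level property name is unknown, so just take the first (and only) value
        JsonNode wrapped = node.elements().next();
        AlwaysDifferentEntry entry = jp.getCodec().treeToValue(wrapped, AlwaysDifferentEntry.class);
        return new AlwaysDifferentDTO(entry);
    }
}

With that in place, the Retrofit interface method can simply return Call<AlwaysDifferentDTO>.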
I have a problem with handling responses in YAML format in Retrofit. Until now I have only handled response bodies in JSON format or plain text. For those types there are ready-made converters like Gson and Jackson for JSON and Scalars for plain text. I found a repository with all the converters: retrofit-converters.
In the documentation there is a short mention that I need to create my own converter for this type:
If you need to communicate with an API that uses a content-format that Retrofit does not support out of the box (e.g. YAML, txt, custom format) or you wish to use a different library to implement an existing format, you can easily create your own converter. Create a class that extends the Converter.Factory class and pass in an instance when building your adapter.
Sadly, I don't see any tutorial on how to create such a converter. Is there any documentation explaining how to do this or is there any other option to handle such a case?
In your case you can use Jackson with its YAML data format.
Retrofit retrofit = new Retrofit.Builder()
        .baseUrl("<your base url>")
        .addConverterFactory(JacksonConverterFactory.create(new ObjectMapper(new YAMLFactory())))
        .build();
For more information, check the following links:
https://github.com/square/retrofit/tree/master/retrofit-converters/jackson
https://github.com/FasterXML/jackson-dataformats-text/tree/master/yaml
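If you would rather follow the documentation literally and write the factory yourself, here is a minimal response-only sketch that reuses Jackson's YAML support (the class name is made up):

import java.lang.annotation.Annotation;
import java.lang.reflect.Type;
import com.fasterxml.jackson.databind.JavaType;
import com.fasterxml.jackson.databind.ObjectMapper;
import com.fasterxml.jackson.dataformat.yaml.YAMLFactory;
import okhttp3.ResponseBody;
import retrofit2.Converter;
import retrofit2.Retrofit;

public final class YamlConverterFactory extends Converter.Factory {

    private final ObjectMapper mapper = new ObjectMapper(new YAMLFactory());

    @Override
    public Converter<ResponseBody, ?> responseBodyConverter(
            Type type, Annotation[] annotations, Retrofit retrofit) {
        JavaType javaType = mapper.getTypeFactory().constructType(type);
        return new Converter<ResponseBody, Object>() {
            @Override
            public Object convert(ResponseBody body) throws java.io.IOException {
                try (ResponseBody b = body) {
                    // Parse the YAML body into the declared return type of the service method
                    return mapper.readValue(b.charStream(), javaType);
                }
            }
        };
    }
}

Register it with .addConverterFactory(new YamlConverterFactory()). Note it only handles response bodies; sending YAML request bodies would need requestBodyConverter() as well.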
I am trying to use Gson to parse JSON that includes some objects and fields that need to be excluded. Do I have to create classes for such objects and include such fields in the classes I create?
As fromJson takes a Class<T> classOfT parameter, you do have to pass a class, but if you don't want to create your own class you can use it this way:
Gson gson = new Gson();
Object obj = gson.fromJson("Response Json String", Object.class);
and you can play with that object in many ways.
You can use the @Expose annotation on your fields with its serialize and deserialize parameters set to false.
Just don't add the field to the class and it will be ignored. There is no need to map all of the input, even with auto-mapping; @SerializedName only changes which JSON name a field maps to, and @Expose (combined with excludeFieldsWithoutExposeAnnotation()) controls which fields take part at all. But the actual beauty of Gson is parsing such nested nodes into classes of various types.
Just see @SerializedName and @Expose.
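Note that Gson only honors @Expose when the instance is built with excludeFieldsWithoutExposeAnnotation(); a small illustrative sketch (class and field names are made up):

class Item {
    @Expose String name;                        // included in both directions
    @Expose(deserialize = false) String token;  // serialized, but ignored when parsing
    String internalCache;                       // no @Expose: excluded entirely
}

Gson gson = new GsonBuilder()
        .excludeFieldsWithoutExposeAnnotation()
        .create();
Item item = gson.fromJson("{\"name\":\"abc\",\"internalCache\":\"x\"}", Item.class);
// item.internalCache stays null because the field was excluded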
How do I serialize a JSON array like the one in the picture I am going to upload? Sorry if this question is repetitive, but I cannot figure out how to do it in my use case.
For more context here are some photos.
https://imgur.com/a/NvbbnMI
I see you're using Retrofit, so it can do this for you automatically. Just change the return type to:
Call<List<petProfile>>
If your Retrofit instance is not doing it, add
.addConverterFactory(GsonConverterFactory.create())
after Retrofit.Builder().
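Something like this, where the base URL and endpoint path are placeholders and petProfile is the class from your code:

Retrofit retrofit = new Retrofit.Builder()
        .baseUrl("https://example.com/")
        .addConverterFactory(GsonConverterFactory.create())
        .build();

public interface PetService {
    @GET("profiles")                       // hypothetical endpoint
    Call<List<petProfile>> getProfiles();  // Gson maps the JSON array to a List
}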
There are a number of libraries that handle JSON serialization and deserialization, such as Gson and JsonSlurper.
Here is a Gson example:
new Gson().toJson(/*place your POJO here*/); // serialize
new Gson().fromJson(/*json object*/, /*class to convert to*/); //deserialize
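A concrete round trip with a made-up POJO:

class Pet {
    String name;
    int age;
}

Pet pet = new Pet();
pet.name = "Rex";
pet.age = 3;

String json = new Gson().toJson(pet);            // {"name":"Rex","age":3}
Pet copy = new Gson().fromJson(json, Pet.class); // back to a Pet instance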
Previously I was receiving the response like this:
I was parsing it like: Call<List<MyObject>> getList();
But now some new elements have been added and the response changed to:
How do I parse this object now? I searched but could not find any solution.
This is how I am setting up my client.
This is the JSON object which I receive as a response:
{"map":{"01":{"F":".","E":".","D":null,"C":null,"B":".","A":"."},"02":{"F":".","E":".","D":null,"C":null,"B":"Z","A":"."},"03":{"F":"A","E":"A","D":null,"C":null,"B":"A","A":"A"},"board":false,"type":{"num":"TT334","board":"WW","date":"31MAR","route":"AWETSW","pcount":""}}}
I dont
There are two potential solutions:
You create a DTO. Gson will ignore fields you don't map in your DTO. Note that your JSON doesn't use a list; it is entirely nested objects.
You manually parse the JSON using Gson's JsonReader.
You can also use a mixture of DTOs and manual parsing; I have done this for large and inconsistent JSON datasets, as in the sketch below.
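A rough sketch of that mixed approach for the JSON above (class and field names are made up, and it assumes a Gson version with JsonParser.parseString, i.e. 2.8.6+): parse the tree, map the fixed parts ("board", "type") directly, and collect the dynamically named row objects into a Map.

class RowEntry { String F; String E; String D; String C; String B; String A; }
class TypeInfo { String num; String board; String date; String route; String pcount; }

Gson gson = new Gson();
// json holds the response string shown in the question
JsonObject map = JsonParser.parseString(json).getAsJsonObject().getAsJsonObject("map");

boolean board = map.get("board").getAsBoolean();
TypeInfo type = gson.fromJson(map.get("type"), TypeInfo.class);

// Everything else that is an object ("01", "02", "03", ...) is a row with a dynamic name
Map<String, RowEntry> rows = new LinkedHashMap<>();
for (Map.Entry<String, JsonElement> e : map.entrySet()) {
    if (e.getValue().isJsonObject() && !e.getKey().equals("type")) {
        rows.put(e.getKey(), gson.fromJson(e.getValue(), RowEntry.class));
    }
}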