I have created a small test app for this issue here: https://github.com/Winghin2517/EpoxyExample2
I would like to pass a list of objects into the Epoxy controller so that I can generate a graph. However, I encountered this error when building the app:
error: Epoxy Processor Exception: Type in Iterable does not implement
hashCode. Type: kwaai.com.exampleepoxy_hashcodeequals.GraphData (View
Prop {view='HeaderView', name='setLineChart',
type=java.util.LinkedList})
Epoxy requires every model attribute to implement equals and hashCode
so that changes in the model can be tracked. If you want the attribute
to be excluded, use the option 'DoNotHash'. If you want to ignore this
warning use the option 'IgnoreRequireHashCode'
I think it is because I am using @ModelProp on a list of objects (a LinkedList of GraphData) and not on a primitive type, as in the example app from Epoxy.
@ModelProp
public void setLineChart(LinkedList<GraphData> graphDataFeed) { }
So I followed the options and modified my @ModelProp to reflect this:
@ModelProp(options = ModelProp.Option.IgnoreRequireHashCode)
After the change, the app builds and runs correctly and the graph is displayed.
However, I do not want to ignore the attribute, as I understand Epoxy uses diffing to update the models in the RecyclerView: https://github.com/airbnb/epoxy/wiki/Diffing
Ignoring the attribute might mean that my models do not get updated correctly in the RecyclerView.
In the guidance material here (https://github.com/airbnb/epoxy/wiki/Epoxy-Models#annotations), I see it says:
A model's state is determined by its equals and hashCode
implementations, which is based on the value of all of the model's
properties.
This state is used in diffing to determine when a model has changed so
Epoxy can update the view.
These methods are generated so you don't have to create them manually.
Why are these methods not generated for me, then? And if they are not generated, how do I create them myself to get rid of the error?
Your GraphData class needs to implement equals and hashCode. It says so right in the error message you copied:
Type in Iterable does not implement hashCode. Type: kwaai.com.exampleepoxy_hashcodeequals.GraphData
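For illustration, here is a minimal sketch of what that could look like if GraphData were a Kotlin class; the fields are made up, since the real ones are not shown. A data class generates equals() and hashCode() from its primary-constructor properties, which is exactly what Epoxy needs for diffing. If GraphData stays a plain Java class, generate the two methods from its fields with your IDE or java.util.Objects.

// Sketch only: timestamp and value are assumed fields, not the real GraphData API.
// Declaring the class as a data class makes Kotlin generate equals() and hashCode()
// from the primary-constructor properties.
data class GraphData(
    val timestamp: Long,
    val value: Float
)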
I am not able to clearly understand what a typed array is in Kotlin. I have seen the function toTypedArray in Kotlin, but did not find any proper definition of what exactly it does. Can anyone please explain with an example?
Thanks
Arrays are generic data structures because they can contain different types of elements. You can have Array<Int> or Array<String> for instance.
There is no separate concept of a "typed" array. The reason for the name of toTypedArray is (I guess) to distinguish it from toArray(), which returns an Array<Any?> (without useful type information about its elements, because everything is an Any? in Kotlin).
The reason why those two exist is that arrays on the JVM cannot be created without knowing the element type. This means that, in general, you cannot create an arbitrary array generically, because generics are erased at runtime, so you wouldn't actually know the correct element type at that point. This is why the plain toArray method either returns Array<Any?> or takes an extra array argument. The extra argument either avoids creating the destination array altogether, or at least provides enough type information at runtime to create an array of the correct element type.
In Kotlin we can go one step further and use reified type parameters, turning information that is available at compile time into more specific code, such as code that creates an array of a specific type (not generically, but directly with the correct element type based on the call-site information). This is what toTypedArray does by reifying its type parameter.
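Here is a small sketch of what that looks like in practice (the join function is just a made-up vararg example):

fun join(vararg parts: String) = parts.joinToString("-")  // made-up vararg function

fun main() {
    val list = listOf("a", "b", "c")

    // toTypedArray() is reified: the element type is known at the call site,
    // so a real String[] is created on the JVM.
    val arr: Array<String> = list.toTypedArray()

    // Typical use case: spreading a collection into a vararg parameter,
    // which needs a typed array rather than a List.
    println(join(*arr))  // prints a-b-c
}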
Whenever we want to inflate a view or get a resource, we have to cast it at runtime. Views, for example, are used like so:
In the past, we would have needed to cast it locally
(RelativeLayout) findViewById(R.id.my_relative_layout_view)
Now, we use generics
findViewById<RelativeLayout>(R.id.my_relative_layout_view)
My question is: why doesn't the compiler (or whatever generates the R class) also keep some kind of reference to the type of the element (no matter whether it's a String, an int, or any other type)? That way casting problems would not occur.
We cannot really speculate on that; it was a design choice.
It might be that they wanted to avoid bloating the APK: every ID would need the fully qualified class name, and so would each ID in android.R, since R is packaged into every APK.
Solutions
However, if you are using Kotlin, you can drop the explicit generic argument; the compiler infers it from the declared type of the variable.
val view: RelativeLayout = findViewById(R.id.my_relative_layout_view)
view.method()
Or even simpler, if you use synthetics:
my_relative_layout_view.method()
Also, if you are using data binding, you can access it like this (note that data binding converts snake_case IDs to camelCase):
binding.myRelativeLayoutView.method()
TL;DR: How can I use Variables from frozen TensorFlow graphs on Android?
1. What I want to do
I have a TensorFlow model that keeps an internal state in multiple variables, created with: state_var = tf.Variable(tf.zeros(shape, dtype=tf.float32), name='state', trainable=False).
This state is modified during inference:
tf.assign(state_var, new_value)
I now want to deploy the model on Android. I was able to get the TensorFlow example app running. There, a frozen model is loaded, which works fine.
2. Restoring variables from frozen graph does not work
However, when you freeze a graph using the freeze_graph script, all Variables are converted to constants. This is fine for the weights of the network, but not for the internal state. Inference fails with the following message, which I interpret as "assign does not work on constant tensors":
java.lang.RuntimeException: Failed to load model from 'file:///android_asset/model.pb'
at org.tensorflow.contrib.android.TensorFlowInferenceInterface.<init>(TensorFlowInferenceInterface.java:113)
...
Caused by: java.io.IOException: Not a valid TensorFlow Graph serialization: Input 0 of node layer_1/Assign was passed float from layer_1/state:0 incompatible with expected float_ref.
Luckily, you can blacklist Variables from being converted to constants. However, this also doesn't work, because the frozen graph then contains uninitialized variables:
java.lang.IllegalStateException: Attempting to use uninitialized value layer_7/state
3. Restoring SavedModel does not work on Android
The last approach I tried is the SavedModel format, which should contain both a frozen graph and the variables. Unfortunately, loading a SavedModel does not work on Android.
SavedModelBundle bundle = SavedModelBundle.load(modelFilename, modelTag);
// produces error:
E/AndroidRuntime: FATAL EXCEPTION: main
Process: org.tensorflow.demo, PID: 27451
java.lang.UnsupportedOperationException: Loading a SavedModel is not supported in Android. File a bug at https://github.com/tensorflow/tensorflow/issues if this feature is important to you
at org.tensorflow.SavedModelBundle.load(Native Method)
4. How can I make this work?
I don't know what else I can try. Here's what I would imagine, but I don't know how to make it work:
Figure out a way to initialize variables on Android
Figure out a different way to freeze the model, so that maybe the initializer op is also part of the frozen graph and can be run from Android
Find out if/how RNNs/LSTMs are implemented internally, because these should have the same requirement of using variables during inference (and I assume LSTMs can be deployed on Android).
???
I have solved this myself by going down a different route. To the best of my knowledge, the "variable" concept cannot be used in the same way on Android as I was used to in Python (e.g. you cannot initialize variables and then have an internal state of the network be updated during inference).
Instead, you can use placeholder and output nodes to keep the state in your Java code and feed it into the network on every inference call.
Replace all tf.Variable occurrences with tf.placeholder. The shape stays the same.
I also defined an additional node used to read the output (maybe you could simply read the placeholder itself, I haven't tried that): tf.identity(inputs, name='state_output')
During inference on Android, you then feed the initial state into the network.
float[] values = {0, 0, 0, ...}; // zeros of the correct shape
inferenceInterface.feed("state", values, ...);
After inference, you read the resulting internal state of the network
float[] values = new float[output_shape];
inferenceInterface.fetch("state_output", values);
You then remember this output in Java to pass it into the 'state' placeholder for the next inference call.
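Putting the Android side together, a rough Kotlin sketch of that loop could look like the following. The node names ("input", "state", "output", "state_output") and all sizes are placeholders for whatever your graph actually uses.

import org.tensorflow.contrib.android.TensorFlowInferenceInterface

// Sketch only: node names and sizes are assumptions. The state lives in a plain
// FloatArray on the Java/Kotlin side and is fed back into the graph on every call.
class StatefulInference(private val inference: TensorFlowInferenceInterface) {

    private val stateSize = 128                // assumed size of the state placeholder
    private var state = FloatArray(stateSize)  // initial state: all zeros

    fun run(input: FloatArray): FloatArray {
        // Feed the regular input and the previous state.
        inference.feed("input", input, 1L, input.size.toLong())
        inference.feed("state", state, 1L, stateSize.toLong())

        inference.run(arrayOf("output", "state_output"))

        // Fetch the prediction and the new state; the state is kept here and
        // passed back into the "state" placeholder on the next call.
        val output = FloatArray(10)            // assumed output size
        inference.fetch("output", output)
        inference.fetch("state_output", state)
        return output
    }
}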
What are the steps to create my own data type when using Arrow?
It's simple to use something like Option with the provided extension constructors like Some(data) or None. However, how can I create my own data type that has functional operators like map() or flatMap()?
The steps to create data types in Arrow that conform to type classes like Functor, and therefore provide methods like map, are outlined here:
1. Enable higher kinded type emulation: https://arrow-kt.io/docs/patterns/glossary/#higher-kinds
2. Implement the type class instance: https://arrow-kt.io/docs/patterns/glossary/#using-higher-kinds-with-typeclasses
In the two links above there is an example that uses ListK, wrapping the standard library List. What the documentation example does not mention is that expanding the extensions Functor adds over ListK (map, lift, etc., as defined in the Functor interface) requires kapt and Arrow Meta:
kapt "io.arrow-kt:arrow-meta:$arrow_version"
Arrow Meta is in charge of expanding higher kinds and extensions for type class instances. One limitation of the current expansion is that using both @higherkind and @extension in the same module won't work, due to the order in which kapt processes them. For that you would need to keep data types in one module and extensions in another. This is actually good practice and what we follow in Arrow, because it allows users to import data types à la carte when they don't want the extensions.
If I'm understanding your question correctly:
https://arrow-kt.io/docs/patterns/glossary/
Note that the annotation processors should be able to generate the type class instances for you. But fundamentally, you just need to decide which type classes your data type will support and provide implementations for them. (Note that the type classes form an inheritance hierarchy, so, for example, if you implement Monad you may also need to implement Functor.)
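If you only need map and flatMap on your own type and don't need Arrow's full type class machinery, a plain sealed class already gets you there. Here is a minimal, hand-rolled sketch (a toy Maybe type, not Arrow's Option, and without the @higherkind/@extension processing described above):

// Toy example: map and flatMap defined directly on a sealed class.
sealed class Maybe<out A> {
    object None : Maybe<Nothing>()
    data class Just<A>(val value: A) : Maybe<A>()

    fun <B> map(f: (A) -> B): Maybe<B> = when (this) {
        is None -> None
        is Just -> Just(f(value))
    }

    fun <B> flatMap(f: (A) -> Maybe<B>): Maybe<B> = when (this) {
        is None -> None
        is Just -> f(value)
    }
}

fun main() {
    val result = Maybe.Just(2)
        .map { it * 10 }
        .flatMap { if (it > 0) Maybe.Just("positive: $it") else Maybe.None }
    println(result)  // Just(value=positive: 20)
}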
I'm using Simple XML on Android to deserialize an XML document I have no control over. Now, every time the XML changes, it breaks my app because I don't have the new element defined in my object class. Is there a way I can tell Simple XML to just ignore those mismatches? I looked at the documentation and couldn't find anything to solve it.
I'm pretty sure you can get around the strict mapping by replacing your regular @Root declaration with @Root(strict = false), which removes the requirement that every element must match a field in your class definition. More precisely, from the documentation:
This is used to determine whether the object represented should be
parsed in a strict manner. Strict parsing requires that each element
and attribute in the XML document match a field in the class schema.
If an element or attribute does not match a field then the parsing
fails with an exception. Setting strict parsing to false allows
details within the source XML document to be skipped during
deserialization.
There's also an example given in the list of tutorials on the Simple XML project site.
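For illustration, a hedged sketch of what that could look like; the class, element names, and fields below are made up rather than taken from your schema. With strict = false, elements in the XML that have no matching field are simply skipped:

import org.simpleframework.xml.Element
import org.simpleframework.xml.Root
import org.simpleframework.xml.core.Persister

// Sketch only: a made-up Contact class. strict = false means unknown elements
// in the source XML no longer cause deserialization to fail.
@Root(name = "contact", strict = false)
class Contact {
    @field:Element(name = "name")
    var name: String = ""

    @field:Element(name = "phone", required = false)
    var phone: String? = null
}

fun main() {
    val xml = """
        <contact>
          <name>Jane</name>
          <phone>555-0100</phone>
          <newElement>ignored because strict = false</newElement>
        </contact>
    """.trimIndent()

    val contact = Persister().read(Contact::class.java, xml)
    println(contact.name)  // Jane
}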
You can also disable strict mode for all tags in a particular read by passing false as the last parameter. Also from their documentation:
Should there be more than a single object that requires loose mapping then using the Root annotation might not be the ideal solution. In such a scenario the persister itself can be asked to perform loose mapping. Simply pass a boolean to the read method indicating the type of mapping required. By default the persister uses strict mapping, which can be overridden on an object by object basis using the Root annotation, as shown in the above example. However, this default can be overridden as can be seen in the code snippet below.
Contact contact = serializer.read(Contact.class, source, false);