arxiv:2311.07427

Boolean Variation and Boolean Logic BackPropagation

Published on Nov 13, 2023
Abstract

The notion of variation is introduced for the Boolean set, and based on it a Boolean logic backpropagation principle is developed. Using this concept, deep models can be built with weights and activations that are Boolean numbers and are operated on with Boolean logic instead of real arithmetic. In particular, Boolean deep models can be trained directly in the Boolean domain without latent weights. Instead of gradients, logic signals are synthesized and backpropagated through the layers.
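The idea of replacing real-valued gradients with backpropagated logic signals can be illustrated with a toy sketch. This is an assumption-laden illustration, not the paper's actual algorithm: a single Boolean neuron whose forward pass is an XNOR agreement count with a majority vote, and whose "logic update" flips the weight bits that voted for a wrong output. The function names (`xnor`, `forward`, `update`) are hypothetical.

```python
import numpy as np

def xnor(a, b):
    """Boolean 'product': True exactly when a and b agree."""
    return a == b

def forward(x, w):
    """Toy Boolean neuron: majority vote over bitwise agreement
    between the Boolean input x and Boolean weights w."""
    agree = xnor(x, w)
    return bool(agree.sum() * 2 > x.size)  # True if more than half agree

def update(x, w, y_pred, y_target):
    """Illustrative logic-style update (not the paper's exact rule):
    when the prediction is wrong, flip the weight bits whose vote
    supported the wrong output, so the majority swings the other way."""
    if y_pred == y_target:
        return w
    wrong_vote = xnor(x, w) == y_pred  # bits that pushed toward y_pred
    return np.where(wrong_vote, ~w, w)

x = np.array([True, False, True, False, True])
w = np.array([True, True, True, False, False])
y_pred = forward(x, w)           # 3 of 5 bits agree -> True
w_new = update(x, w, y_pred, y_target=False)
```

A full version of the paper's scheme would additionally backpropagate a Boolean "should this input bit flip?" signal to earlier layers; the sketch above only shows the weight-update half of that idea.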
