Modalities: Text · Formats: webdataset · Size: < 1K · Libraries: Datasets, WebDataset
huyang0905 committed · Commit f467c2b · 1 Parent(s): 2a5babd

Upload README.md

Files changed (1): README.md (+40 -26)
README.md CHANGED
@@ -1,47 +1,61 @@
- # DataBack: Dataset of CNF Formulas and Backbone Variable Phases

  ## What is DataBack
- `DataBack` is a dataset that consists of SAT CNF formulas, each labeled with the phases of its backbone variables. Within `DataBack`, there are two distinct subsets: the pre-training set, named `DataBack-PT`, and the fine-tuning set, named `DataBack-FT`. The state-of-the-art backbone extractor, `CadiBack`, has been employed to obtain the backbone labels. Due to the increased complexity of the fine-tuning formulas, we have allocated a timeout of 1,000 seconds for the pre-training formulas and 5,000 seconds for the fine-tuning ones.

- The `DataBack` dataset has been employed to both pre-train and fine-tune our `NeuroBack` model, which has demonstrated significant improvements in SAT solving efficiency. For an in-depth exploration of `DataBack`, please refer to [our `NeuroBack` paper](https://arxiv.org/pdf/2110.14053.pdf).

- ## Authors
- Wenxi Wang, Yang Hu, Mohit Tiwari, Sarfraz Khurshid, Ken McMillan, Risto Miikkulainen

- ## Publication
- If you use `DataBack` in your research, please kindly cite [our paper](https://arxiv.org/pdf/2110.14053.pdf):
  ```bib
  @article{wang2023neuroback,
-   author = {Wang, Wenxi and
-             Hu, Yang and
-             Tiwari, Mohit and
-             Khurshid, Sarfraz and
-             McMillan, Kenneth L. and
-             Miikkulainen, Risto},
-   title = {NeuroBack: Improving CDCL SAT Solving using Graph Neural Networks},
    journal={arXiv preprint arXiv:2110.14053},
    year={2021}
  }
  ```

  ## Directory Structure
  ```
- |- original                  # Original CNFs and their backbone variable phases
- |  |- cnf_pt.tar.gz          # CNFs for model pre-training
- |  |- bb_pt.tar.gz           # Backbone phases for pre-training CNFs
- |  |- cnf_ft.tar.gz          # CNFs for model fine-tuning
- |  |- bb_ft.tar.gz           # Backbone phases for fine-tuning CNFs
  |
- |- dual                      # Dual CNFs and their backbone variable phases
- |  |- dual_cnf_pt.tar.gz     # Dual CNFs for model pre-training
  |  |- dual_bb_pt.tar.gz      # Backbone phases for dual pre-training CNFs
- |  |- dual_cnf_ft.tar.gz     # Dual CNFs for model fine-tuning
  |  |- dual_bb_ft.tar.gz      # Backbone phases for dual fine-tuning CNFs
  ```

  ## File Naming Convention
- In the original directory, each CNF tar file contains compressed CNF files named `[cnf_name].[compression_format]`, where `[compression_format]` could be bz2, lzma, xz, gz, etc. Correspondingly, each backbone tar file comprises compressed backbone files named `[cnf_name].backbone.xz`. It is important to note that a compressed CNF file will always share its `[cnf_name]` with its associated compressed backbone file.

- In the dual directory, the naming convention remains consistent, but with an added `d_` prefix for each compressed CNF or backbone file to indicate it pertains to a dual CNF formula.

- ## Contact
- Wenxi Wang (wenxiw@utexas.edu), Yang Hu (huyang@utexas.edu)

+ # DataBack: Dataset of SAT Formulas and Backbone Variable Phases

  ## What is DataBack
+ `DataBack` is a dataset that consists of SAT formulas in CNF format, each labeled with the phases of its backbone variables. Within `DataBack`, there are two distinct subsets: the pre-training set, named `DataBack-PT`, and the fine-tuning set, named `DataBack-FT`. The state-of-the-art backbone extractor [`CadiBack`](https://wenxiwang.github.io/papers/cadiback.pdf) has been employed to obtain the backbone variable phases. We have allocated a timeout of 1,000 seconds for the pre-training formulas and 5,000 seconds for the fine-tuning ones, due to the increased complexity of the fine-tuning formulas.

+ The `DataBack` dataset has been employed to both pre-train and fine-tune our `NeuroBack` model, which has demonstrated significant improvements in SAT solving efficiency. For an in-depth exploration of `DataBack`, please refer to our [`NeuroBack`](https://arxiv.org/pdf/2110.14053.pdf) or [`CadiBack`](https://wenxiwang.github.io/papers/cadiback.pdf) papers.

+ ## Contributors
+ Wenxi Wang (wenxiw@utexas.edu), Yang Hu (huyang@utexas.edu)

+ ## References
+ If you use `DataBack` in your research, please kindly cite the following papers.
+
+ [`NeuroBack`](https://arxiv.org/pdf/2110.14053.pdf) paper:
  ```bib
  @article{wang2023neuroback,
+   author = {Wang, Wenxi and
+             Hu, Yang and
+             Tiwari, Mohit and
+             Khurshid, Sarfraz and
+             McMillan, Kenneth L. and
+             Miikkulainen, Risto},
+   title = {NeuroBack: Improving CDCL SAT Solving using Graph Neural Networks},
    journal={arXiv preprint arXiv:2110.14053},
    year={2021}
  }
  ```
+
+ [`CadiBack`](https://wenxiwang.github.io/papers/cadiback.pdf) paper:
+ ```bib
+ @inproceedings{biere2023cadiback,
+   title={CadiBack: Extracting Backbones with CaDiCaL},
+   author={Biere, Armin and Froleyks, Nils and Wang, Wenxi},
+   booktitle={26th International Conference on Theory and Applications of Satisfiability Testing (SAT 2023)},
+   year={2023},
+   organization={Schloss Dagstuhl-Leibniz-Zentrum f{\"u}r Informatik}
+ }
+ ```
+
  ## Directory Structure
  ```
+ |- original                  # Original SAT formulas and their backbone variable phases
+ |  |- cnf_pt.tar.gz          # SAT formulas (in CNF format) for model pre-training
+ |  |- bb_pt.tar.gz           # Backbone phases for pre-training formulas
+ |  |- cnf_ft.tar.gz          # SAT formulas (in CNF format) for model fine-tuning
+ |  |- bb_ft.tar.gz           # Backbone phases for fine-tuning formulas
  |
+ |- dual                      # Dual SAT formulas and their backbone variable phases
+ |  |- dual_cnf_pt.tar.gz     # Dual SAT formulas (in CNF format) for model pre-training
  |  |- dual_bb_pt.tar.gz      # Backbone phases for dual pre-training CNFs
+ |  |- dual_cnf_ft.tar.gz     # Dual SAT formulas (in CNF format) for model fine-tuning
  |  |- dual_bb_ft.tar.gz      # Backbone phases for dual fine-tuning CNFs
  ```
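For example, a minimal Python sketch for inspecting and unpacking one of these archives, assuming `original/cnf_pt.tar.gz` has already been downloaded (the local paths are illustrative):

```python
import tarfile

# Each member of the archive is a compressed CNF file,
# named as described under "File Naming Convention" below.
with tarfile.open("original/cnf_pt.tar.gz", "r:gz") as tar:
    members = tar.getmembers()
    print(f"{len(members)} compressed CNF files, e.g. {members[0].name}")
    tar.extractall(path="cnf_pt")  # unpack for downstream processing
```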

  ## File Naming Convention
+ In the original directory, each CNF tar file (`cnf_*.tar.gz`) contains compressed CNF files named `[cnf_name].[compression_format]`, where `[compression_format]` can be bz2, lzma, xz, gz, etc. Correspondingly, each backbone tar file (`bb_*.tar.gz`) comprises compressed backbone files named `[cnf_name].backbone.xz`. A compressed CNF file always shares its `[cnf_name]` with its associated compressed backbone file.

+ In the dual directory, the naming convention remains the same, except that each CNF or backbone tar file carries an additional `dual_` prefix, and each compressed CNF or backbone file carries a `d_` prefix, indicating that it pertains to a dual SAT formula. A small helper illustrating the CNF-to-backbone name mapping is sketched below.
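A minimal sketch of that name mapping; the helper `backbone_name` is illustrative and not part of the dataset:

```python
def backbone_name(compressed_cnf: str) -> str:
    """Map a compressed CNF file name `[cnf_name].[compression_format]`
    to its backbone file name `[cnf_name].backbone.xz`."""
    cnf_name = compressed_cnf.rsplit(".", 1)[0]  # drop the compression suffix
    return f"{cnf_name}.backbone.xz"

# The same rule applies to dual files, which simply keep their `d_` prefix:
print(backbone_name("example.cnf.xz"))    # example.cnf.backbone.xz
print(backbone_name("d_example.cnf.gz"))  # d_example.cnf.backbone.xz
```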

+ ## Format of the Extracted Backbone File
+ The extracted backbone file (`*.backbone`) adheres to the output format of [`CadiBack`](https://wenxiwang.github.io/papers/cadiback.pdf): each line contains the letter 'b' followed by an integer, whose absolute value is the backbone variable ID and whose sign indicates its phase.
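A sketch of a reader for such files, assuming the signed-literal interpretation above, reading the compressed `*.backbone.xz` files directly, and skipping a terminating `b 0` line if one is present:

```python
import lzma

def read_backbone(path: str) -> dict[int, bool]:
    """Parse a `[cnf_name].backbone.xz` file into {variable ID: phase},
    where True denotes the positive phase."""
    phases: dict[int, bool] = {}
    with lzma.open(path, "rt") as f:
        for line in f:
            parts = line.split()
            if len(parts) == 2 and parts[0] == "b":
                lit = int(parts[1])
                if lit != 0:  # skip a terminating "b 0" line, if present
                    phases[abs(lit)] = lit > 0
    return phases
```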