Flux-John-William-Godward-LoKr-lr_2e-4

This is a LyCORIS adapter derived from black-forest-labs/FLUX.1-dev.

No validation prompt was used during training.

Validation settings

  • CFG: 3.0
  • CFG Rescale: 0.0
  • Steps: 20
  • Sampler: None
  • Seed: 42
  • Resolution: 1024x1024

Note: The validation settings are not necessarily the same as the training settings.
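
The CFG and CFG Rescale settings above can be illustrated with the standard classifier-free guidance combination step. Note that FLUX.1-dev normally consumes guidance as an embedded scalar rather than by running a second unconditional pass, so this is a generic sketch of the formula (with a hypothetical `apply_cfg` helper), not the exact Flux code path:

```python
import numpy as np

def apply_cfg(uncond, cond, cfg=3.0, rescale=0.0):
    # Classifier-free guidance: push the prediction away from the
    # unconditional output, scaled by the CFG value.
    guided = uncond + cfg * (cond - uncond)
    if rescale > 0.0:
        # CFG-rescale trick: blend the guided output toward the
        # standard deviation of the conditional prediction.
        # rescale=0.0 (as in this card) disables it.
        rescaled = guided * (cond.std() / guided.std())
        guided = rescale * rescaled + (1.0 - rescale) * guided
    return guided

uncond = np.zeros(4)
cond = np.ones(4)
print(apply_cfg(uncond, cond, cfg=3.0))  # [3. 3. 3. 3.]
```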

You can find some example images in the following gallery. Every validation prompt shared the negative prompt "blurry, cropped, ugly":

  • unconditional (blank prompt)
  • In the style of a g0dw4rd painting, A woman in a flowing violet silk dress reclines on white marble steps, her hand trailing over intricate mosaic tiles. Beside her stands a bronze vessel filled with white lilies. The background reveals a sun-drenched Mediterranean terrace with distant azure seas and cypress trees.
  • In the style of a g0dw4rd painting, A woman in diaphanous gold and cream robes sits on a carved marble bench, reading from an ancient scroll. Behind her, Ionic columns frame a view of the Tyrrhenian Sea, while a copper brazier releases thin wisps of incense into the warm afternoon light.
  • In the style of a g0dw4rd painting, A woman in rose-pink silk lounges on a leopard-skin throw draped over marble steps. Purple wisteria cascades from a pergola above, while peacocks strut across the terrace. The background shows a glimpse of Naples Bay with Vesuvius in the misty distance.
  • In the style of a g0dw4rd painting, A woman in iridescent robes tends to a mysterious device of crystal and brass on a marble altar. Holographic equations float in the air around her while maintaining classical architectural elements. The background shows a fusion of ancient columns and quantum probability clouds against a Mediterranean sky.
  • In the style of a g0dw4rd painting, A woman in traditional Roman dress interfaces with translucent digital screens floating between marble columns. Her silk robes shimmer with embedded circuits while maintaining classical drapery. The setting sun casts rose-gold light across the technological artifacts and ancient stonework.
  • In the style of a g0dw4rd painting, A woman in a classical peplos of shifting temporal energies sits before a marble chronometer. Threads of past and future weave through her fingers while maintaining Godward's attention to fabric texture. The background shows overlapping visions of ancient Rome and future cityscapes.
  • In the style of a g0dw4rd painting, A woman in neoclassical dress reclines on marble steps while neural networks visualize as golden threads around her. Her chiton appears traditional but subtly displays flowing data patterns. Behind her, classical architecture merges with crystalline computational structures.
  • In the style of a g0dw4rd painting, A woman in a traditional stola tends to bioluminescent plants growing between marble columns. Her classical garments contain subtle patterns of DNA helices, while maintaining proper dress folds and textures. The Mediterranean sunset illuminates hybrid flowers that are part organic, part technological.

The text encoder was not trained. You may reuse the base model text encoder for inference.

Training settings

  • Training epochs: 10
  • Training steps: 7000
  • Learning rate: 0.0002
  • Max grad norm: 2.0
  • Effective batch size: 4
    • Micro-batch size: 4
    • Gradient accumulation steps: 1
    • Number of GPUs: 1
  • Prediction type: flow-matching (flux parameters=['flux_guidance_value=1.0'])
  • Rescaled betas zero SNR: False
  • Optimizer: adamw_bf16
  • Precision: Pure BF16
  • Quantised: Yes (int8-quanto)
  • Xformers: Not used
  • LyCORIS Config:
{
    "algo": "lokr",
    "multiplier": 1.0,
    "linear_dim": 10000,
    "linear_alpha": 1,
    "factor": 16,
    "apply_preset": {
        "target_module": [
            "Attention",
            "FeedForward"
        ],
        "module_algo_map": {
            "Attention": {
                "factor": 16
            },
            "FeedForward": {
                "factor": 8
            }
        }
    }
}
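
To make the `factor` values in the config above concrete: LoKr parameterises each targeted weight update as a Kronecker product, with `factor` controlling how each dimension is split. The sketch below uses NumPy, a hypothetical 3072x3072 linear layer, and a simplified stand-in for LyCORIS's own factorization helper; with a very large `linear_dim` such as the 10000 here, LyCORIS conventionally keeps the second Kronecker block as a full matrix rather than decomposing it further.

```python
import numpy as np

def factorization(dim, factor):
    # Split dim into (a, b) with a * b == dim and a as close to
    # `factor` as possible (simplified version of LyCORIS's helper).
    a = factor
    while dim % a != 0:
        a -= 1
    return a, dim // a

# Hypothetical 3072x3072 linear layer with factor=16 (the Attention preset)
out_a, out_b = factorization(3072, 16)   # (16, 192)
in_a, in_b = factorization(3072, 16)     # (16, 192)

# LoKr represents the weight delta as kron(A, B)
A = np.random.randn(out_a, in_a)         # 16 x 16
B = np.random.randn(out_b, in_b)         # 192 x 192
delta_W = np.kron(A, B)                  # 3072 x 3072

full_params = 3072 * 3072                # 9,437,184
lokr_params = A.size + B.size            # 37,120
print(delta_W.shape, lokr_params)
```

A smaller `factor` (such as the 8 used for FeedForward modules) makes the first block smaller and the second larger, trading capacity in one block for the other.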

Datasets

john-willaim-godward-512

  • Repeats: 17
  • Total number of images: 43
  • Total number of aspect buckets: 10
  • Resolution: 0.262144 megapixels
  • Cropped: False
  • Crop style: None
  • Crop aspect: None
  • Used for regularisation data: No

john-willaim-godward-768

  • Repeats: 17
  • Total number of images: 43
  • Total number of aspect buckets: 12
  • Resolution: 0.589824 megapixels
  • Cropped: False
  • Crop style: None
  • Crop aspect: None
  • Used for regularisation data: No

john-willaim-godward-1024

  • Repeats: 5
  • Total number of images: 43
  • Total number of aspect buckets: 13
  • Resolution: 1.048576 megapixels
  • Cropped: False
  • Crop style: None
  • Crop aspect: None
  • Used for regularisation data: No
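
The three dataset entries above are the same 43 images at different resolutions, each with its own repeat count. As a rough illustration (exact semantics of `repeats` vary between SimpleTuner versions, so treat this as a plain images-times-repeats product), the sampled pool per pass works out to:

```python
# Samples contributed per pass by each resolution bucket group
datasets = {
    "john-willaim-godward-512":  {"images": 43, "repeats": 17},
    "john-willaim-godward-768":  {"images": 43, "repeats": 17},
    "john-willaim-godward-1024": {"images": 43, "repeats": 5},
}

samples = sum(d["images"] * d["repeats"] for d in datasets.values())
print(samples)  # 1677
```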

Inference

import torch
from diffusers import DiffusionPipeline
from lycoris import create_lycoris_from_weights

model_id = 'black-forest-labs/FLUX.1-dev'
adapter_id = 'pytorch_lora_weights.safetensors'  # you will have to download this manually
lora_scale = 1.0

# Load the base pipeline, then merge the LoKr weights into its transformer
pipeline = DiffusionPipeline.from_pretrained(model_id, torch_dtype=torch.bfloat16)
wrapper, _ = create_lycoris_from_weights(lora_scale, adapter_id, pipeline.transformer)
wrapper.merge_to()

prompt = "An astronaut is riding a horse through the jungles of Thailand."
device = 'cuda' if torch.cuda.is_available() else 'mps' if torch.backends.mps.is_available() else 'cpu'

pipeline.to(device)
image = pipeline(
    prompt=prompt,
    num_inference_steps=20,
    generator=torch.Generator(device=device).manual_seed(1641421826),
    width=1024,
    height=1024,
    guidance_scale=3.0,
).images[0]
image.save("output.png", format="PNG")