BasicRandomSearch (Black-Box Attack)
This page documents the BasicRandomSearch algorithm (a SimBA variant), a black-box adversarial attack that requires only query access to model predictions.
BasicRandomSearch Implementation
AdversarialAttacks.BasicRandomSearch — Type
```julia
BasicRandomSearch(; epsilon=0.1, max_iter=50, bounds=nothing, rng=Random.default_rng())
```

Subtype of `BlackBoxAttack`. Creates adversarial examples using the SimBA random-search algorithm. Based on Guo, C., Gardner, J., You, Y., Wilson, A. G., & Weinberger, K. (2019). Simple black-box adversarial attacks. In International Conference on Machine Learning (pp. 2484–2493). PMLR.
Arguments
- `epsilon`: Step size for perturbations (default: `0.1`).
- `max_iter`: Maximum number of iterations for searching (default: `50`). Each iteration randomly selects a coordinate to perturb.
- `bounds`: Optional vector of `(lower, upper)` tuples specifying per-feature bounds. If `nothing`, defaults to `[0, 1]` for all features (suitable for normalized images). For tabular data, provide bounds matching the feature ranges, e.g., `[(4.3, 7.9), (2.0, 4.4), ...]` for Iris-like data.
- `rng`: Random number generator for reproducibility (default: `Random.default_rng()`).
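The role these parameters play in the search can be illustrated with a conceptual sketch of a SimBA-style loop. This is not the package's internal implementation; `predict_prob` is a hypothetical query function standing in for a model call:

```julia
using Random

# Conceptual sketch of a SimBA-style random search (not the package's
# actual code). `predict_prob(x, y)` is a hypothetical function returning
# the model's probability for label `y` on input `x`.
function simba_sketch(predict_prob, x, y; epsilon=0.1, max_iter=50,
                      bounds=fill((0.0, 1.0), length(x)),
                      rng=Random.default_rng())
    x_adv = copy(x)
    best = predict_prob(x_adv, y)
    for _ in 1:max_iter
        i = rand(rng, eachindex(x_adv))    # randomly select a coordinate
        for step in (epsilon, -epsilon)    # try +epsilon, then -epsilon
            candidate = copy(x_adv)
            candidate[i] = clamp(candidate[i] + step, bounds[i]...)
            if predict_prob(candidate, y) < best
                # keep the step if it lowers confidence in the true label
                x_adv, best = candidate, predict_prob(candidate, y)
                break
            end
        end
    end
    return x_adv
end
```

Each iteration costs at most two model queries, which is why the attack needs only prediction access and no gradients.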
AdversarialAttacks.attack — Method
```julia
attack(atk::BasicRandomSearch, model::Chain, sample; detailed_result)
```

Perform a black-box adversarial attack on the given model and sample using the SimBA basic random-search variant.
Arguments
- `atk::BasicRandomSearch`: An instance of the BasicRandomSearch (black-box) attack.
- `model::Chain`: The machine learning model (deep learning or classical) to be attacked.
- `sample`: Input sample as a named tuple with `data` and `label`.
- `detailed_result::Bool=false`: Return format control. `false` (default): returns the adversarial example only (Array). `true`: returns a NamedTuple with metrics (`x_adv`, `success`, `queries_used`, `final_label`).
Returns
- If `detailed_result=false`: Adversarial example (same type as `sample.data`).
- If `detailed_result=true`: NamedTuple containing:
  - `x_adv`: Adversarial example.
  - `success::Bool`: Whether the attack succeeded.
  - `queries_used::Int`: Number of model queries.
  - `final_label::Int`: Final predicted class.
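A usage sketch of the detailed return form, assuming a trained Flux `Chain` named `model` and a normalized input vector `x` (both hypothetical, so the snippet is not runnable on its own):

```julia
# Hypothetical: `model` is a trained Flux Chain, `x` a feature vector in [0, 1].
sample = (data = x, label = 3)
atk = BasicRandomSearch(epsilon = 0.1, max_iter = 100)

result = attack(atk, model, sample; detailed_result = true)
if result.success
    println("Fooled the model in ", result.queries_used,
            " queries; new label: ", result.final_label)
else
    println("Attack failed within the query budget")
end
```

Requesting the NamedTuple is useful when benchmarking query efficiency, since `queries_used` records the attack's cost.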
AdversarialAttacks.attack — Method
```julia
attack(atk::BasicRandomSearch, model::DecisionTreeClassifier, sample; detailed_result)
```

Perform a black-box adversarial attack on a `DecisionTreeClassifier` using BasicRandomSearch (SimBA).
Arguments
- `atk::BasicRandomSearch`: Attack instance with `epsilon` and optional `bounds`.
- `model::DecisionTreeClassifier`: DecisionTree.jl classifier to attack.
- `sample`: NamedTuple with `data` and `label` fields.
- `detailed_result::Bool=false`: Return format control. `false` (default): returns the adversarial example only (Array). `true`: returns a NamedTuple with metrics (`x_adv`, `success`, `queries_used`, `final_label`).
Returns
- If `detailed_result=false`: Adversarial example (same type as `sample.data`).
- If `detailed_result=true`: NamedTuple containing:
  - `x_adv`: Adversarial example.
  - `success::Bool`: Whether the attack succeeded.
  - `queries_used::Int`: Number of model queries.
  - `final_label::Int`: Final predicted class.
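For tabular models the `bounds` argument keeps perturbations inside valid feature ranges. A sketch for an Iris-like setup, assuming a fitted DecisionTree.jl classifier named `tree_model` (hypothetical; the bounds match the tabular example in the constructor docs):

```julia
# Hypothetical sketch: `tree_model` is a DecisionTreeClassifier already
# fitted on Iris-like data. Bounds are per-feature (sepal/petal ranges).
using DecisionTree

atk = BasicRandomSearch(
    epsilon  = 0.2,
    bounds   = [(4.3, 7.9), (2.0, 4.4), (1.0, 6.9), (0.1, 2.5)],
    max_iter = 200,
)
sample = (data = [5.1, 3.5, 1.4, 0.2], label = 1)  # one Iris-setosa row
x_adv  = attack(atk, tree_model, sample)           # adversarial example only
```

Without explicit `bounds`, the default `[0, 1]` range would clip most raw Iris features, so per-feature bounds matter for unnormalized tabular data.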
Quick Example
```julia
using AdversarialAttacks

atk = BasicRandomSearch(
    epsilon = 0.3f0,
    bounds = [(0.0f0, 1.0f0), (0.0f0, 1.0f0)],
    max_iter = 50,
)
println("Attack: ", name(atk))
println("Type check: ", atk isa BlackBoxAttack)
println("Epsilon: ", atk.epsilon, ", Max iter: ", atk.max_iter)
```

```
Attack: BasicRandomSearch{Float32, Vector{Tuple{Float32, Float32}}, Random.TaskLocalRNG}
Type check: true
Epsilon: 0.3, Max iter: 50
```