The First VoicePrivacy Attacker Challenge is a new kind of challenge organized as part of the VoicePrivacy initiative. It focuses on developing attacker systems against voice anonymization; these attackers will be evaluated against a set of state-of-the-art anonymization systems, including some submitted to the VoicePrivacy 2024 Challenge. Training, development, and evaluation datasets are provided along with baseline attacker systems. To develop attacker systems, participants may use any additional training data and models, provided that they are openly available and declared before the specified deadline. Participants should develop their attacker systems in the form of automatic speaker verification (ASV) systems and submit their scores on the development and evaluation data to the organizers. The evaluation metric is the equal error rate (EER).
Task: develop an attacker system against voice anonymization systems in the form of an automatic speaker verification (ASV) system.
Data: datasets for training, development, and evaluation are provided along with baseline attacker systems. Participants may also use any additional training data and models, provided that they are openly available and declared before the specified deadline.
Metric: equal error rate (EER) resulting from the ASV attacker.
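For concreteness, below is a minimal NumPy sketch of how an EER can be computed from a set of target (same-speaker) and non-target (different-speaker) ASV scores. The baseline recipe may use a different interpolation at the crossing point, so treat this as illustrative only:

```python
import numpy as np

def compute_eer(target_scores, nontarget_scores):
    """EER: the operating point where the false-acceptance rate (FAR)
    equals the false-rejection rate (FRR) when sweeping the threshold."""
    scores = np.concatenate([target_scores, nontarget_scores])
    labels = np.concatenate([np.ones_like(target_scores),
                             np.zeros_like(nontarget_scores)])
    labels = labels[np.argsort(scores)]  # sweep thresholds from low to high
    # Rejecting all trials below threshold index i rejects the first i trials:
    # count rejected targets (FRR) and still-accepted non-targets (FAR).
    frr = np.cumsum(labels) / labels.sum()
    far = 1.0 - np.cumsum(1.0 - labels) / (1.0 - labels).sum()
    i = np.argmin(np.abs(frr - far))
    return 0.5 * (frr[i] + far[i])

# Toy usage: well-separated score distributions yield a low EER.
rng = np.random.default_rng(0)
tgt = rng.normal(1.0, 1.0, 1000)    # scores for same-speaker trials
non = rng.normal(-1.0, 1.0, 1000)   # scores for different-speaker trials
print(f"EER = {100 * compute_eer(tgt, non):.2f}%")
```

Note that for an attacker, lower EER is better: it means the attacker can still verify speakers despite the anonymization.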
Results and accepted challenge papers will be presented at a special session at ICASSP 2025, Hyderabad, India.
All deadlines are 11:59 PM on the respective day, US Pacific Standard Time.
Participants are requested to register for the evaluation. Registration should be performed only once per participating entity, using the Registration form.
Participants remain the owners of their code and are not required to share it. No intellectual property rights are transferred to the challenge organizers.
There is no limit on the number of submitted systems. Each participant may submit scores/results for one or more attacker systems, each targeting all anonymization systems or only some of them. The number of submitted attacker systems for each anonymization system must equal the number of submitted compressed archive files for that anonymization system. If the same attacker is applied to multiple anonymization systems, each of these applications should correspond to a separate submission file.
Each submission should include the EER and the corresponding ASV scores (for the development and evaluation data) obtained with the proposed attacker system for the four trial lists, in the same format as generated by the baseline attacker system (see the example link in Section 7 of the evaluation plan).
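The exact score-file layout is defined by the baseline attacker output linked in the evaluation plan. Purely as an illustration, assuming a Kaldi-style layout with one `<enrollment-id> <test-id> <score>` line per trial and a matching key file labelling each trial target or nontarget (an assumption here, not the official format), a per-list EER could be obtained with the compute_eer sketch above:

```python
def eer_from_files(score_file, key_file):
    # Hypothetical layout: "enroll test score" / "enroll test target|nontarget".
    keys = {}
    with open(key_file) as f:
        for line in f:
            enroll, test, label = line.split()
            keys[(enroll, test)] = (label == "target")
    tgt, non = [], []
    with open(score_file) as f:
        for line in f:
            enroll, test, score = line.split()
            (tgt if keys[(enroll, test)] else non).append(float(score))
    return compute_eer(np.array(tgt), np.array(non))
```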
The name of the archive file should be <Team name>_<Anonymization system id>.<Attacker id>.tar.gz (or .zip), where:
- Team name is the team name used in the registration form.
- Anonymization system id is the id of the anonymization system as written in the evaluation plan (B3, B4, B5, T8-5, T10-2, T12-5, or T25-1); it should be exactly the same in the archive file name and in the system description.
- Attacker id is the id of the attacker system; it should be exactly the same as in the system description.

All the submitted systems must be described in detail in the system descriptions. The challenge ranking will include only those systems which are compliant with the challenge rules (described in the evaluation plan) and for which we receive before the deadlines: scores, EER results, and a complete and clear system description.
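As a quick illustration of the naming convention above (the team name "VoiceTeam", attacker id "A1", and the file names inside the archive are all hypothetical, and the allowed characters in team/attacker ids are an assumption), an archive could be sanity-checked and packaged like this:

```python
import re
import tarfile

# Hypothetical submission: team "VoiceTeam", attacker "A1" against system B3.
archive_name = "VoiceTeam_B3.A1.tar.gz"

# Check the name against the required pattern before submitting.
pattern = (r"^[A-Za-z0-9-]+_(B3|B4|B5|T8-5|T10-2|T12-5|T25-1)"
           r"\.[A-Za-z0-9-]+\.(tar\.gz|zip)$")
assert re.fullmatch(pattern, archive_name), "archive name does not follow the convention"

# Package the score/result files (file names are illustrative only).
with tarfile.open(archive_name, "w:gz") as tar:
    for path in ["scores_dev.txt", "scores_eval.txt", "EER_results.txt"]:
        tar.add(path)
```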
The system description should contain all the results for the submitted systems. System descriptions may also include additional results for systems whose scores and results were not submitted, but these results will not be included in the official ranking. We also recommend adding a table that provides a summary description of each submitted system (modules/components with short descriptions, input/output features, and the training data and open-source models used in training/development).