Introduction
Preparing for SQE1 is about applying law to realistic scenarios, not recalling trivia. The SRA’s specification sets out how single best answer questions should test Day 1 solicitor competence across all Functioning Legal Knowledge (FLK) areas. Questions are drafted and reviewed by practising solicitors, checked for clarity and fairness, and monitored with post-exam data.
This guide turns those standards into a practical set of checks to help you choose a prep provider and review whether their MCQs match the official approach. You’ll find good and bad question examples, a provider process case study, clear quality metrics, and a step-by-step audit you can use straight away.
What You'll Learn
- What the Day 1 (Threshold Standard Level 3) level means for SQE1 MCQs
- How a high-quality single best answer MCQ is built (stem, lead‑in, options)
- Which metrics (difficulty, discrimination, distractor efficiency, DIF, Cronbach’s alpha) show quality
- How to audit a provider’s bank against SRA‑style requirements
- Good and bad MCQ examples you can copy as test rubrics
- A case study of a rigorous, multi‑stage question QA process
- A summary checklist and quick reference for fast provider comparisons
Core Concepts
The SQE1 Day 1 Standard
- Purpose: Test application of core legal principles in realistic, client‑centred scenarios at the level of a competent newly qualified solicitor.
- Format: Single best answer (five options; one defensibly correct) focused on application and judgement, not rote recall.
- Development and review:
- Written by practising solicitors (subject-matter specialists).
- Peer reviewed by other solicitors, including generalists for clarity.
- Updated when the law changes, with documented cut‑off dates.
- Data checks (post‑use):
- Item difficulty (facility), discrimination, distractor performance, and differential item functioning (DIF) to support fairness.
- Test‑level reliability (e.g., Cronbach’s alpha) and standard error of measurement.
- Method sources: Approaches used in professional exams (e.g., Case & Donahue 2008; Paniagua & Swygert 2016; Scully 2017; Ali et al. 2016; Tavakol & Dennick 2011; Livingston 2018; Abdul Rahim et al. 2022).
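For reference, the two test-level statistics named above have standard textbook forms (this is general psychometric notation, not an SRA-published formula):

```latex
% Cronbach's alpha for a test of k items, where \sigma_i^2 is the variance
% of scores on item i and \sigma_X^2 is the variance of total test scores:
\alpha = \frac{k}{k-1}\left(1 - \frac{\sum_{i=1}^{k}\sigma_i^2}{\sigma_X^2}\right)

% Standard error of measurement, where s_X is the standard deviation of
% total scores and r_{XX} is the reliability estimate (e.g., alpha):
\mathrm{SEM} = s_X\sqrt{1 - r_{XX}}
```

In plain terms: alpha rises as items vary together (internal consistency), and the SEM shrinks as reliability improves, which is why providers who publish both give you a usable picture of score precision.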
Anatomy of a Single Best Answer MCQ
A strong SBA has three parts, each with a clear job.
- Stem (scenario)
  - Realistic and concise; relevant to solicitor work.
  - Contains all material facts; avoids padding or stereotypes.
  - Clear, neutral English; no loaded or sensational topics.
- Lead‑in (the question)
  - One specific task in positive form.
  - Passes the “cover test”: you can reason towards an answer before seeing options.
  - Avoids double negatives and vague prompts like “Which statement is true?”
- Options (A–E)
  - One correct option; four plausible distractors based on common mistakes or partial truths.
  - Parallel wording and similar length to reduce clueing.
  - No new facts in options; all facts sit in the stem.
  - Avoid absolutes (“always/never”) unless the law is genuinely absolute.
Quality Metrics and Fairness Controls
Good providers use and share (at least in summary) the following:
- Difficulty (facility): Proportion who answer correctly. Healthy banks include a spread, with many items around 0.3–0.7.
- Discrimination: Stronger candidates choose the correct answer more often. Weak items fail to separate ability.
- Distractor efficiency: Wrong options attract meaningful selections; dead distractors are rewritten.
- DIF: Items perform consistently across demographic groups of comparable ability.
- Reliability: Cronbach’s alpha and standard error of measurement reported at test level.
- Accessibility and plain English: Neutral characters (“client/solicitor”), straightforward wording, inclusive scenarios, and accessible design.
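The item-level metrics above are straightforward to compute. The sketch below, using only the Python standard library and invented toy data, shows one common formulation of each: facility as the proportion correct, discrimination as the point-biserial correlation between item success and total score, and distractor efficiency as the share of wrong options attracting at least 5% of responses (the 5% threshold is a common convention, not an SRA figure).

```python
def facility(correct_flags):
    """Proportion of candidates answering the item correctly (0 to 1)."""
    return sum(correct_flags) / len(correct_flags)

def point_biserial(item_flags, total_scores):
    """Correlation between getting this item right and overall test score."""
    n = len(item_flags)
    mean_x = sum(item_flags) / n
    mean_y = sum(total_scores) / n
    cov = sum((x - mean_x) * (y - mean_y)
              for x, y in zip(item_flags, total_scores)) / n
    sd_x = (sum((x - mean_x) ** 2 for x in item_flags) / n) ** 0.5
    sd_y = (sum((y - mean_y) ** 2 for y in total_scores) / n) ** 0.5
    return cov / (sd_x * sd_y)

def distractor_efficiency(choices, key, options="ABCDE", threshold=0.05):
    """Share of distractors chosen by at least `threshold` of candidates."""
    n = len(choices)
    distractors = [o for o in options if o != key]
    functioning = sum(1 for o in distractors
                      if choices.count(o) / n >= threshold)
    return functioning / len(distractors)

# Toy data (invented for illustration): each candidate's chosen option on
# one item, the answer key, and each candidate's total test score.
choices = list("BBABBCBDBB")
key = "B"
flags = [1 if c == key else 0 for c in choices]
totals = [78, 80, 55, 74, 82, 50, 69, 48, 90, 73]

print(facility(flags))
print(point_biserial(flags, totals))
print(distractor_efficiency(choices, key))
```

On this toy data the item is moderately easy, discriminates strongly (candidates with higher totals get it right far more often), and three of the four distractors are functioning, which is the shape of result a healthy item should show.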
Key Examples or Case Studies
Good MCQ Example
Stem: An architect received a leaflet from website designers advertising their website design packages. On the back of the leaflet was a copy of the website designers' standard terms, which contained a limitation clause. The architect wrote a letter to the website designers asking them to design his website and he attached to his letter a copy of his own standard terms and conditions, which did not contain a limitation clause. The architect received a quotation for £2,500 from the website designers. The architect signed and returned a tear-off slip to the website designers which stated that he accepted the quotation on the website designers' standard terms and conditions.
Lead‑in: Which of the following statements best describes the legal position?
Options:
A. The terms on the leaflet constituted an offer which the architect accepted by asking for a quotation.
B. The quotation constituted an offer which the architect accepted on the website designers' standard terms and conditions.
C. The quotation constituted an offer which the architect accepted on the architect's standard terms and conditions.
D. The letter from the architect to the web designers constituted an offer which the web designers accepted by sending a quotation.
E. The letter from the architect to the web designers constituted a counter offer which the web designers accepted by sending a quotation.
Correct answer: B.
Why it meets the standard
- Stem: Realistic commercial scenario about contract formation and battle of the forms; all material facts included and relevant.
- Lead‑in: Clear, focused task testing the sequence of offer and acceptance.
- Options: One correct answer (quotation = offer; acceptance on designers' terms); plausible distractors reflecting common errors about leaflets, whose terms prevail, and timing.
Bad MCQ Example
Stem: John Smith, a self-employed architect from Manchester who runs Smith Architecture Ltd, received a leaflet from website designers advertising their website design packages. On the back of the leaflet was a copy of the website designers' standard terms, which contained a limitation clause. John wrote a letter to the website designers asking them to design his website and he attached to his letter a copy of his own standard terms and conditions, which did not contain a limitation clause. John received a quotation for £2,500 from the website designers. John signed and returned a tear-off slip to the website designers which stated that he accepted the quotation on the website designers' standard terms and conditions.
Lead‑in: Which of the following is not incorrect?
Options:
A. The terms on the leaflet constituted an offer which the architect accepted by asking for a quotation.
B. The quotation constituted an offer which the architect accepted on the website designers' standard terms and conditions.
C. A contract was formed when the architect received the quotation because he had already expressed interest.
D. No contract exists because the architect's terms conflicted with the designers' terms.
E. The limitation clause is always unenforceable in consumer contracts.
Correct answer: B.
Why it fails
- Stem adds irrelevant personal details (full name, location, company name) that add no legal value.
- Lead‑in uses a double negative, increasing cognitive load.
- Option C adds new facts not in the stem (timing based on “expressing interest”).
- Option D makes an absolute claim about battle of the forms.
- Option E brings in irrelevant consumer law for a business‑to‑business scenario.
- Weak distractors make the correct answer obvious by elimination.
Case Study (Provider Process Aligned with SRA Practice)
One model uses a staged workflow that matches standard practice:
- Authoring: Practising solicitors draft scenario‑based SBAs aligned with the FLK specification and Day 1 level.
- Peer review: Independent solicitors check legal accuracy, clarity, fairness, and coverage.
- Automated checks: Scripts validate structure (five options; one best answer; parallel language), style (plain English), and option length parity.
- Legal QA: Items checked against a published legal cut‑off date and correct jurisdiction (England and Wales).
- Data loop: After pilots or live use, items are reviewed using facility, discrimination, distractor efficiency, and response time data. Weak items are revised or retired; content mapping tracks topic balance.
This process signals strong quality control, currency, and fairness.
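The automated-checks stage above can be sketched in a few lines. The item schema (a dict with `stem`, `lead_in`, `options`, and `key`), the 2:1 length-parity ratio, and the banned-word list are all assumptions for illustration, not any provider's actual tooling.

```python
def structural_issues(item, n_options=5, length_ratio=2.0):
    """Return a list of rule violations for one single-best-answer item."""
    issues = []
    options = item.get("options", {})
    if len(options) != n_options:
        issues.append(f"expected {n_options} options, found {len(options)}")
    if item.get("key") not in options:
        issues.append("answer key does not match any option label")
    if "?" not in item.get("lead_in", ""):
        issues.append("lead-in is not phrased as a question")
    lengths = [len(text) for text in options.values()]
    if lengths and max(lengths) > length_ratio * min(lengths):
        issues.append("option lengths differ too much (possible clueing)")
    for banned in (" always ", " never "):  # absolutes often flag distractors
        for label, text in options.items():
            if banned in f" {text.lower()} ":
                issues.append(f"option {label} uses an absolute term")
    return issues

# Hypothetical item used only to exercise the checks.
item = {
    "stem": "An architect receives a quotation from website designers...",
    "lead_in": "Which of the following statements best describes the legal position?",
    "options": {"A": "Offer accepted by request.",
                "B": "Quotation was the offer.",
                "C": "No contract was formed.",
                "D": "Counter offer accepted.",
                "E": "The clause is always unenforceable."},
    "key": "B",
}
for issue in structural_issues(item):
    print(issue)
```

Run against the hypothetical item, the checker flags only option E's "always", which is exactly the kind of mechanical screen that frees human reviewers to focus on legal accuracy.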
Practical Applications
Use this step‑by‑step audit when choosing a provider.
1. Check Alignment
What to ask:
- Show their mapping to the SRA FLK specification and Day 1 competency.
- Provide a content matrix covering topics and cognitive level.
- Confirm they distinguish SQE1 (Day 1) from SQE2 levels.
What to look for:
- Full coverage of FLK areas with appropriate weighting.
- Scenario‑based SBAs focused on application rather than recall.
- Scheduled legal updates tied to a documented cut‑off date.
2. Inspect Item Quality
Sample size and selection:
- Review 20–25 SBAs across topics, including easier and harder items.
- Ask for recently updated items.
Apply systematic checks:
- Cover test: Can you reason towards an answer from the stem and lead‑in alone?
- Options: Parallel structure, similar length, grammatically consistent.
- Clueing: Watch for hedged language that appears only in correct options, or absolutes that appear only in distractors.
- Fact placement: All material facts in the stem; none added in options.
Red flags:
- Correct answer obvious by elimination.
- Irrelevant personal or sensational details in the stem.
- Options that introduce new legal concepts not raised in the stem.
3. Review Explanations
Quality indicators:
- Clear rationale for the correct option as the best legal analysis.
- Each distractor addressed with specific reasons.
- Notes on common misconceptions.
- References to key cases or statutes where helpful.
Avoid:
- Bare assertions, circular reasoning, or missing distractor feedback.
- Overlong explanations that bury the point.
Ask:
- How explanations are checked and updated.
- Whether explanations reference SRA guidance where relevant.
- If explanations include learning objectives or competency mapping.
4. Ask About the Writing Process
Author qualifications:
- Practising solicitors with recent experience.
- Mix of specialists and generalists matched to topic areas.
Review process:
- Multi‑stage review including legal and educational checks.
- Use of non‑specialist solicitors for clarity and accessibility.
- Documented process for resolving reviewer disagreements.
Currency and updates:
- Published legal cut‑off dates per cycle.
- Monitoring of legal changes affecting FLK.
- Retirement or replacement of outdated items with audit notes.
5. Look for Data and Feedback
Analytics and reporting:
- Dashboards for strengths/weaknesses by topic.
- Time analysis per question and per section.
- Progress tracking across sessions.
- Optional anonymised cohort comparisons.
Item performance metrics:
- Facility spread from easy to hard.
- Discrimination indices showing separation of ability.
- Distractor analysis to confirm functioning wrong options.
- Post‑use review and revision programme.
Mock exam features:
- Interface close to the real exam.
- Accurate timing and item counts.
- Post‑exam analysis with targeted practice options.
6. Fairness and Accessibility
Language and representation:
- Neutral labels (“the client”, “the solicitor”).
- Plain English with short, readable sentences.
- Scenarios that avoid stereotypes and assumptions.
Technical accessibility:
- Adjustable font and high‑contrast themes.
- Keyboard navigation and screen reader compatibility.
- Features supporting different learning needs.
Content sensitivity:
- Professional tone for sensitive topics.
- Diverse contexts reflecting modern practice.
7. Security and Variety
Question bank management:
- Large enough bank to reduce memorisation (hundreds per topic is typical).
- Regular rotation and refresh.
- Item exposure tracking.
Technical security:
- Randomised question and option order.
- Session controls to deter harvesting.
- Secure platform and clear terms of use.
Quality maintenance:
- Regular audits to find and fix outdated or weak items.
- User feedback channel with rapid triage and corrections.
8. Try a Timed Sample
Test conditions:
- Sit a 30–45 minute session under exam‑like timing.
- Use the standard interface; no notes.
- Mix topics to reflect the exam.
During and after:
- Did stems include all needed facts without padding?
- Did distractors attract plausible choices?
- Did timing feel realistic?
- Was topic coverage balanced?
Self‑assessment:
- Knowledge gaps: Unfamiliar law.
- Application issues: Law known but hard to apply to facts.
- Item clarity: Ambiguous wording or options.
- Time management: Pacing and hesitation points.
Follow‑up questions to the provider:
- How do your sample stats compare with real SQE1 performance?
- What support is offered for weak areas?
- How do you help me separate knowledge gaps from comprehension issues?
- What extra resources target low‑performing topics?
Summary Checklist
- Clear mapping to SRA FLK and Day 1 level
- Scenario‑based SBAs with concise stems and focused lead‑ins
- Options are parallel, similar length, and plausibly wrong when incorrect
- No negative or ambiguous lead‑ins; items pass the cover test
- Explanations justify the correct answer and address each distractor
- Bank covers all FLK areas with a mix of difficulty levels
- Legal updates scheduled and tied to a published cut‑off date
- Post‑use metrics tracked: difficulty, discrimination, distractor efficiency
- Fairness controls: plain English, neutral characters, DIF monitoring
- Timed mocks with a realistic interface and clear reporting
Quick Reference
| Check/Metric | What Good Looks Like | Why It Matters |
| --- | --- | --- |
| Lead‑in quality | One clear task; passes cover test | Tests application rather than guesswork |
| Options (A–E) | One best; parallel; no new facts | Reduces clueing; improves validity |
| Difficulty (facility) | Mixed bank; many items around 0.3–0.7 | Supports a steady learning curve |
| Discrimination | Positive, moderate to strong | Separates stronger from weaker candidates |
| Distractor efficiency | All wrong options attract some responses | Confirms distractors are working |