
Web Application Critique

CMM721: Web Application Development - Session 10

Birmingham Newman University

Lecturer: James Williams

Masters Level Conversion Course

3-hour session • 18 slides • 2 tasks • Live evaluation demos

Session Timeline:

  • 10 min: Registration & waiting
  • 20 min: Opening slides
  • 45 min: Task 1
  • 15 min: Break/Catch up
  • 20 min: Secondary slides
  • 45 min: Task 2
  • Remaining: Self-study

Learning Objectives

  • Evaluate web application usability
  • Assess performance and optimization
  • Conduct accessibility audits (WCAG)
  • Check standards compliance
  • Identify security vulnerabilities
  • Consider ethical implications

Why Critique Web Applications?

  • Learning: Understand what makes good UX
  • Improvement: Identify areas for enhancement
  • Standards: Ensure compliance with web standards
  • Accessibility: Make web inclusive for all users
  • Performance: Optimize user experience
  • Security: Protect users and data
Critical Analysis: Essential skill for professional developers

Usability Evaluation Framework

Learnability

How easy is it for first-time users?

Efficiency

Can experienced users perform tasks quickly?

Memorability

Can users remember how to use it?

Errors

How many errors? How severe? Easy recovery?

Satisfaction

Is the experience pleasant?

Accessibility

Can everyone use it?

Nielsen's 10 Usability Heuristics

  1. Visibility of system status - Keep users informed
  2. Match between system and real world - Use familiar language
  3. User control and freedom - Provide undo/redo
  4. Consistency and standards - Follow conventions
  5. Error prevention - Preventing problems is better than good error messages
  6. Recognition rather than recall - Minimize memory load
  7. Flexibility and efficiency - Accelerators for experts
  8. Aesthetic and minimalist design - No unnecessary info
  9. Help users recognize, diagnose, and recover from errors - Plain-language messages that suggest a fix
  10. Help and documentation - When needed, easy to search

🎯 LIVE DEMO: Usability Analysis

Example: E-commerce Checkout

✅ Good Practices:

  • Progress indicator shows current step (1 of 3)
  • Form fields clearly labeled with helpful hints
  • Real-time validation with clear error messages
  • Guest checkout option (no forced registration)
  • Order summary visible throughout process

❌ Bad Practices:

  • Progress is not saved - everything is lost on refresh
  • Requires account creation before showing total cost
  • CAPTCHA on every page
  • Timeout after 5 minutes with no warning
  • No back button - must restart entire process

Performance Evaluation

Key Metrics:

  • First Contentful Paint (FCP): < 1.8s
  • Largest Contentful Paint (LCP): < 2.5s
  • Time to Interactive (TTI): < 3.8s
  • Cumulative Layout Shift (CLS): < 0.1
  • Total Blocking Time (TBT): < 200ms
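
Several of these metrics can be observed directly in the browser. Below is a minimal sketch using the standard PerformanceObserver API (paste into the DevTools console of a Chromium-based browser); it is not a full measurement setup - production monitoring would normally use a library such as web-vitals:

// Log Largest Contentful Paint candidates as they are reported
new PerformanceObserver((list) => {
  for (const entry of list.getEntries()) {
    console.log('LCP candidate:', entry.startTime.toFixed(0), 'ms');
  }
}).observe({ type: 'largest-contentful-paint', buffered: true });

// Accumulate Cumulative Layout Shift from individual layout-shift entries
let cls = 0;
new PerformanceObserver((list) => {
  for (const entry of list.getEntries()) {
    if (!entry.hadRecentInput) cls += entry.value;
  }
  console.log('CLS so far:', cls.toFixed(3));
}).observe({ type: 'layout-shift', buffered: true });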

Tools:

  • Google Lighthouse
  • WebPageTest
  • Chrome DevTools Performance tab
  • GTmetrix

🎯 Performance Score Example

Lighthouse Audit Results

  • Performance: 92
  • Accessibility: 98
  • Best Practices: 100
  • SEO: 87

Scores: 90-100 = Good, 50-89 = Needs Improvement, 0-49 = Poor

Task 1: Usability & Performance Audit

Objective:

Conduct comprehensive usability and performance evaluation

Requirements:

  1. Select a live web application (e.g., e-commerce, news, social)
  2. Apply Nielsen's 10 heuristics - rate each (1-5)
  3. Run Lighthouse audit - analyze all 4 categories
  4. Test on mobile and desktop
  5. Document 5 usability issues with severity ratings
  6. Identify 3 performance bottlenecks
  7. Provide actionable recommendations

Deliverables:

  • Written report (1000-1500 words)
  • Screenshots with annotations
  • Lighthouse report screenshots

Accessibility (WCAG 2.1)

WCAG: Web Content Accessibility Guidelines

Four Principles (POUR):

  • Perceivable: Can users perceive the content?
  • Operable: Can users operate the interface?
  • Understandable: Can users understand the content?
  • Robust: Does it work with assistive technologies?

Conformance Levels:

  • Level A: Minimum
  • Level AA: Required (typical legal standard)
  • Level AAA: Enhanced

Common Accessibility Issues

Color Contrast

Text not readable against its background (WCAG AA requires at least 4.5:1 for normal text; see the sketch after this list)

Missing Alt Text

Images without descriptive alternatives

Keyboard Navigation

Can't navigate without mouse

Form Labels

Inputs without associated labels

Heading Structure

Skipped heading levels (h1→h3)

ARIA Misuse

Incorrect or unnecessary ARIA attributes
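
The 4.5:1 figure comes from WCAG's contrast-ratio formula, which compares the relative luminance of the text and background colours. A minimal JavaScript sketch of that calculation (the sample colours are illustrative):

// Relative luminance of an sRGB colour given as [r, g, b] in 0-255
function luminance([r, g, b]) {
  const [R, G, B] = [r, g, b].map((c) => {
    c /= 255;
    return c <= 0.03928 ? c / 12.92 : Math.pow((c + 0.055) / 1.055, 2.4);
  });
  return 0.2126 * R + 0.7152 * G + 0.0722 * B;
}

// WCAG contrast ratio: (lighter + 0.05) / (darker + 0.05)
function contrastRatio(fg, bg) {
  const [light, dark] = [luminance(fg), luminance(bg)].sort((a, b) => b - a);
  return (light + 0.05) / (dark + 0.05);
}

console.log(contrastRatio([204, 204, 204], [255, 255, 255]).toFixed(2)); // ≈ 1.61 - fails AA (needs 4.5)
console.log(contrastRatio([0, 0, 0], [255, 255, 255]).toFixed(2));       // 21.00 - passes easily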

🎯 LIVE DEMO: Accessibility Testing

Testing Tools & Techniques

Automated Tools:

  • WAVE: Web Accessibility Evaluation Tool
  • axe DevTools: Browser extension
  • Lighthouse: Accessibility score
  • Pa11y: Command-line tool

Manual Tests:

  • Keyboard only: Tab through entire page
  • Screen reader: NVDA (Windows) or VoiceOver (Mac)
  • Zoom to 200%: Check text reflow
  • Color blindness: Use the Color Oracle simulator
  • Disable CSS: Check semantic structure
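
For the keyboard-only test it helps to see exactly which element has focus at each Tab press. A small console sketch, purely for manual testing (not production code):

// Log every element that receives keyboard focus while tabbing through the page
document.addEventListener('focusin', (event) => {
  const el = event.target;
  console.log('Focused:', el.tagName, el.id || el.className || '(no id/class)');
});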

Code Examples: Accessibility

❌ Bad:

<!-- Not keyboard-focusable and announced as plain text by screen readers -->
<div onclick="submitForm()">Submit</div>
<!-- No alt text, so the image is meaningless to screen-reader users -->
<img src="logo.png">
<!-- A placeholder is not a label; it disappears as soon as the user types -->
<input type="text" placeholder="Name">
<!-- Low contrast, non-descriptive text, and not an actual link -->
<span style="color: #ccc">Click here</span>

✅ Good:

<!-- Native button: keyboard-focusable and correctly announced -->
<button type="submit">Submit</button>
<!-- Descriptive alternative text -->
<img src="logo.png" alt="Company Logo">
<!-- Label programmatically associated with the input -->
<label for="name">Name:</label>
<input type="text" id="name" name="name">
<!-- Sufficient contrast and link text that makes sense on its own -->
<a href="/page" style="color: #0066cc">Read more about our services</a>

Standards Compliance

HTML Validation:

  • W3C Markup Validator: validator.w3.org
  • Check for proper DOCTYPE
  • Validate semantic structure
  • Ensure tags are properly closed and nested

CSS Validation:

  • W3C CSS Validator: jigsaw.w3.org/css-validator
  • Check vendor prefixes
  • Validate syntax

Best Practices:

  • Use semantic HTML5 elements
  • Follow HTML/CSS specifications
  • Test across browsers

Task 2: Accessibility Audit

Objective:

Conduct comprehensive accessibility evaluation

Requirements:

  1. Select a government or education website
  2. Run WAVE accessibility tool - document all errors
  3. Run Lighthouse accessibility audit
  4. Test keyboard navigation - document issues
  5. Test with screen reader (NVDA/VoiceOver)
  6. Check color contrast (minimum 4.5:1)
  7. Validate HTML and check semantic structure
  8. Rate against WCAG 2.1 Level AA criteria

Deliverables:

  • Accessibility audit report (1500-2000 words)
  • WCAG compliance checklist
  • Prioritized recommendations

Security Considerations

Common Vulnerabilities:

  • XSS: Cross-Site Scripting
  • CSRF: Cross-Site Request Forgery
  • SQL Injection: Unsanitized database queries
  • Insecure Authentication: Weak passwords, no 2FA
  • Exposed Sensitive Data: API keys in client code
  • Missing HTTPS: Unencrypted connections
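
To make the first item concrete: on the client side, the difference between unsafe and safe handling of user input often comes down to how it reaches the DOM. A minimal sketch (the #results element and q parameter are purely illustrative):

// User-controlled value taken from the query string
const userInput = new URLSearchParams(location.search).get('q') || '';

// ❌ Vulnerable: the string is parsed as HTML, so a payload like
// <img src=x onerror=alert(1)> would execute
document.querySelector('#results').innerHTML = 'Results for ' + userInput;

// ✅ Safer: textContent treats the input as plain text, never as markup
document.querySelector('#results').textContent = 'Results for ' + userInput;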

Security Headers:

Content-Security-Policy
X-Content-Type-Options: nosniff
X-Frame-Options: DENY
Strict-Transport-Security
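
A minimal sketch of how these headers might be set, assuming a Node/Express back end (the header values are illustrative; a real Content-Security-Policy needs tuning per application):

const express = require('express');
const app = express();

// Attach the security headers above to every response
app.use((req, res, next) => {
  res.set({
    // Restrict where scripts, styles, images, etc. may be loaded from
    'Content-Security-Policy': "default-src 'self'",
    // Stop browsers from MIME-type sniffing responses
    'X-Content-Type-Options': 'nosniff',
    // Prevent the site being embedded in frames (clickjacking)
    'X-Frame-Options': 'DENY',
    // Tell browsers to use HTTPS only, for the next year
    'Strict-Transport-Security': 'max-age=31536000; includeSubDomains'
  });
  next();
});

app.listen(3000);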

Ethical Considerations

Privacy:

  • Clear privacy policy
  • Informed consent for data collection
  • GDPR compliance (for EU users)
  • Right to deletion

Dark Patterns (Anti-patterns):

  • Trick questions: Confusing opt-out language
  • Roach motel: Easy to get in, hard to leave
  • Forced continuity: Hidden recurring charges
  • Bait and switch: The user intends one action, but a different one happens instead
Ethics: Design with users' best interests in mind

🎯 Dark Pattern Examples

Common Manipulative Designs

❌ Confirm Shaming

"No thanks, I don't want to save money"

Makes user feel bad for declining

❌ Hidden Costs

£10 product → £25 at checkout (fees, shipping)

Surprise charges at final step

❌ Misdirection

Giant "DOWNLOAD" ad vs tiny actual download button

Deliberately confusing UI

✅ Ethical Alternative

Clear options: "Yes, I'd like updates" or "No thanks"

Honest, respectful language

Critique Report Structure

  1. Introduction: App overview, purpose, target audience
  2. Methodology: Tools used, testing approach
  3. Usability Analysis: Heuristic evaluation, issues found
  4. Performance: Metrics, bottlenecks, recommendations
  5. Accessibility: WCAG compliance, barriers identified
  6. Standards: HTML/CSS validation, browser compatibility
  7. Security: Vulnerabilities, best practices
  8. Ethics: Privacy, dark patterns, social responsibility
  9. Recommendations: Prioritized action items
  10. Conclusion: Overall assessment, rating

Best Practices

  • Be objective: Support claims with evidence
  • Be specific: "Button too small" → "Submit button is 20x20px (should be min 44x44px)"
  • Prioritize: Critical → High → Medium → Low
  • Be constructive: Offer solutions, not just problems
  • Consider context: Target audience, use cases
  • Test thoroughly: Multiple devices, browsers, scenarios
  • Document everything: Screenshots, metrics, quotes

Next Session: Comparative Critique

  • Comparing multiple web applications
  • Framework comparison (React vs Vue vs Angular)
  • Design decisions and trade-offs
  • Scalability and maintainability
  • Technology stack evaluation
  • Competitive analysis

Preparation: Select 2-3 similar apps to compare (e.g., Twitter/Mastodon/Bluesky)

Questions & Discussion

Contact: JWilliams@Staff.newman.ac.uk

Office Hours: By appointment

Resources: WCAG Guidelines, Nielsen Norman Group, Web.dev

Thank you for your attention!