System Design Space
Knowledge graphSettings

Updated: March 9, 2026 at 9:00 PM

Developing Apps with GPT-4 and ChatGPT (short summary)

easy

Source

Telegram: book_cube

Review post with a practical take on what is still useful and what is already outdated.

Read post

Developing Apps with GPT-4 and ChatGPT

Authors: Enamul Haque, Douglas Murillo, S. M. Nasrul Quader, Benjamin Pyle, Steve Tingiris
Publisher: O'Reilly Media, 2023 (1st Edition)
Length: 125 pages

A concise practical 2023 guide to getting started with LLM apps: OpenAI API basics, prompting, prompt injection mitigation, lightweight fine-tuning, and early LangChain patterns.

Original
Translated

What this book is about

This is a concise 2023 hands-on guide that walks through the starter path: understand LLM basics, wire the API, and build first working use cases without heavy theory overhead.

It still works as a strong onboarding resource, but should be read as a starting point rather than a full up-to-date GenAI system map.

Book structure: 5 short chapters

1. GPT-4 and ChatGPT overview

  • A compact NLP introduction and why transformers became the default architecture for modern LLMs.
  • Historical baseline: from Attention Is All You Need to the GPT model line.
  • Early product examples from education, finance, and media.

2. ChatGPT API deep dive

  • How to create an API key, use Playground, and call models from Python.
  • Request/response format and core generation parameters.
  • Pricing basics and entry-level security considerations for integration.

3. Building simple applications

  • Hands-on focus with input/output design and prompt injection awareness.
  • The same LLM stack is used to build multiple lightweight app scenarios.
  • A practical bridge from playground experiments to a first working product.

4. Prompt engineering and fine-tuning

  • Step-by-step prompting, few-shot patterns, and task decomposition.
  • Introduction to OpenAI fine-tuning and cost implications.
  • Practical sample: email marketing text generation.

5. LangChain and plugins (historical snapshot)

  • Shows an early stage of the ecosystem: LangChain plus ChatGPT plugins.
  • From a 2026 perspective, this section is valuable as an evolution snapshot toward GPTs.
  • Useful for understanding how agent/tool patterns evolved in real products.

What remains useful

  • Very low entry barrier for engineers new to LLM applications.
  • Clear progression from concepts to API usage and runnable mini-demos.
  • Covers a practical prompt-engineering baseline for first production attempts.

What is outdated or incomplete

  • Parts of the API and model lineup are outdated compared with the 2026 landscape.
  • The plugin section reflects a pre-GPTs product era.
  • Evaluation, guardrails, and production reliability are covered only at a basic level.

Demo apps covered in the book

  • News generator
  • YouTube transcript-based summarizer
  • Video game lore assistant
  • Basic voice control interface

Recommended companion reading

To keep this topic current, pair the book with newer materials on evaluation, guardrails, context engineering, and agent workflow design.

Foundational references used in this learning path: GPT / Generative pre-trained transformer and Attention Is All You Need.

For prompt safety in production systems, add current controls from OWASP LLM Top 10.

Related chapters

Where to find the book

Enable tracking in Settings

System Design Space

© 2026 Alexander Polomodov