F5 Blog


Four Reasons Why the Public Sector Turns to F5

See why the public sector relies on F5 to secure its apps from increasingly sophisticated cyberattacks—while delivering the performance constituents demand.

No “One Model to Rule Them All” with Generative AI

Three AI deployment patterns are emerging, each with its own operational responsibilities and trade-offs. The choice of model and deployment pattern should be a strategic one that acknowledges the fast-evolving landscape of generative AI for enterprises.

Building a Vibrant LGBTQ+ Community at F5

The F5 Pride EIG builds community and advocates for the company’s LGBTQ+ employees, and seeks broader impact by extending allyship to employees’ families and friends.

Three Things You Should Know about AI Applications

AI applications are modern apps, but they do have important differences. Learn the key things to know about AI-powered applications.

F5 is Scaling AI Inference from the Inside Out

Don't let GPU resources sit idle. Build out scalable and secure AI compute complexes with the right hardware for inferencing.

From OpenAPI to NGINX as an API Gateway Using a Declarative API

Explore how to transform an OpenAPI schema definition into a fully functioning NGINX configuration running as an API Gateway with Web Application Firewall security and a Developer Portal using a declarative API approach.
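The article covers F5's declarative tooling end to end; purely as an illustration of the underlying idea, the minimal Python sketch below derives NGINX location blocks from an OpenAPI paths object. It is not the declarative API the article describes, and the `openapi_spec` fragment and `inventory_backend` upstream are hypothetical placeholders.

```python
import re

# Hypothetical OpenAPI 3.0 fragment standing in for a real schema file.
openapi_spec = {
    "openapi": "3.0.0",
    "info": {"title": "Inventory API", "version": "1.0"},
    "paths": {
        "/items": {"get": {}, "post": {}},
        "/items/{id}": {"get": {}, "delete": {}},
    },
}

UPSTREAM = "http://inventory_backend"  # assumed backend service name


def nginx_locations(spec: dict, upstream: str) -> str:
    """Render one NGINX location block per OpenAPI path."""
    blocks = []
    for path, operations in spec.get("paths", {}).items():
        methods = " ".join(m.upper() for m in operations)
        if "{" in path:
            # Turn {param} path templates into a regex location match.
            pattern = re.sub(r"\{[^}]+\}", "[^/]+", path)
            location = f"~ ^{pattern}$"
        else:
            location = path
        blocks.append(
            f"# {methods}\n"
            f"location {location} {{\n"
            f"    proxy_pass {upstream};\n"
            f"}}"
        )
    return "\n\n".join(blocks)


if __name__ == "__main__":
    print(nginx_locations(openapi_spec, UPSTREAM))
```

Running the sketch prints plain location blocks such as `location ~ ^/items/[^/]+$ { proxy_pass http://inventory_backend; }`; the workflow in the article layers Web Application Firewall policy and Developer Portal publication on top of the generated gateway configuration.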

The Impact of AI Inferencing on Data Center Architecture

To support the full AI life cycle, organizations will require significant updates in their architecture—specifically changes in the network. Failure to do so may lead to an inability to scale and unreliable operations.

NGINX One Update: New Features Released

NGINX One is the next phase of our journey and an effort to make all NGINX products easier to configure, secure, scale and manage.

State of AI Application Strategy: Digital Maturity or Madness

Explore F5's 'State of AI Application Strategy', focusing on enterprise AI adoption, security concerns, technology stack, and model management practices.

AI Inference Patterns

AI inference services give developers access to AI and can be consumed in a variety of ways. Key patterns include SaaS, Cloud Managed, and Self-Managed, each with unique trade-offs in scalability, cost, and data control.