GitHub is linking developers with security pros to reduce the number of vulnerabilities that may be hiding in code already in their workflows. The highly popular Microsoft-owned code repository this week said its security campaigns, which were released in public preview in October 2024, are now generally available to GitHub Advanced Security and GitHub Code Security developers. The AI-powered tool is the latest step by GitHub to help developers address security issues that accumulate in fast-paced CI/CD development processes – what’s known as security debt. GitHub in 2023 built Copilot Autofix into pull requests, which the company said allowed programming…
This came up one day at work when a developer was using XACT_ABORT. I hadn’t used it before and thought I’d better check it out. It’s off by default in SQL Server, but why would you use it? TL;DR: Without XACT_ABORT ON, only some errors cause a full rollback, while others might leave the transaction partially committed. With XACT_ABORT ON, all errors cause an immediate rollback, preventing partial data changes. Setting XACT_ABORT…
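A minimal T-SQL sketch of the behavior described above; the table, column and values are purely illustrative:

    -- Illustrative only: two inserts in one explicit transaction, where the
    -- second insert is assumed to violate the primary key on dbo.Orders.
    SET XACT_ABORT ON;   -- with this ON, the error rolls back the entire transaction

    BEGIN TRANSACTION;
        INSERT INTO dbo.Orders (OrderId) VALUES (1);
        INSERT INTO dbo.Orders (OrderId) VALUES (1);  -- duplicate key -> runtime error
    COMMIT TRANSACTION;

    -- With XACT_ABORT OFF (the default), the duplicate-key error terminates only
    -- that statement, the transaction stays open, and the COMMIT keeps the first row.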
Zencoder has updated its artificial intelligence (AI) platform for writing and testing code to provide integration with third-party DevOps tools such as Jira, GitHub, GitLab and Sentry, in addition to tighter coupling with the VS Code and JetBrains integrated development environments. At the same time, the company has added a “Coffee Mode” capability that allows Zencoder AI agents to perform tasks in the background. Finally, Zencoder reports that with this update it has, based on the SWE-Bench-Multimodal benchmark, improved overall performance by a factor of two. Zencoder CEO Andrew Filev said the overall goal is to accelerate application development by integrating the company’s…
Software startups have to make a lot of decisions as they move through the stages of building a thriving business. Among the many issues to debate is whether to open-source their technology. It is a big decision, and the licensing around open source receives a lot of attention in tech circles. Part of the issue is that open source attracts strong opinions. Whenever a large company decides to restrict its license, even if it is for valid reasons, it can face considerable backlash (as HashiCorp and Elastic learned in recent years). On…
Have you ever wondered why a binary or package installed on your system does not behave as expected, or perhaps cannot even start at all? While downloading packages, you may face problems such as an unstable network connection or an unexpected power outage, which can result in the installation of a corrupted package. Verifying the files on the file system against the information stored in the package is therefore a vital step in keeping the packages on your system uncorrupted. In…
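As a rough illustration of that kind of check (the package name below is just an example, and debsums must be installed separately on Debian-based systems):

    # RPM-based distributions: compare an installed package's files
    # (checksums, sizes, permissions) against the RPM database.
    rpm -V curl

    # Debian/Ubuntu: compare installed files against the package's recorded MD5 sums.
    debsums curl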
Today, we are excited to announce the release of a new, open-source Docker Language Server and Docker DX VS Code extension. Built in collaboration between Docker and the Microsoft Container Tools team, this new integration enhances the existing Docker extension with improved Dockerfile linting, inline image vulnerability checks, Docker Bake file support, and outlines for Docker Compose files. By working directly with Microsoft, we’re ensuring a native, high-performance experience that complements the existing developer workflow. It’s the next evolution of Docker tooling in VS Code — built to help you move faster, catch issues earlier, and focus on what…
Why the Divide No Longer Makes Sense
When it comes to traditional vs. modern apps, the days of making a binary choice are long gone. Organizations must rely on both, as traditional business-critical applications often run on virtual machines, while modern applications are typically container-based. Yet, inexplicably and to their detriment, many enterprises still treat virtual machines (VMs) and containerized apps as separate entities—splitting their infrastructure and platforms into two distinct halves. The problem? Each half requires its own licenses, its own management, and its own financial and operational overhead. This results in inefficiency multiplying at an alarming rate—slowing innovation,…
Google this week previewed a bevy of artificial intelligence (AI) agents and platforms that enable application developers and the DevOps teams that support them to automate a wide range of software engineering tasks. Announced at the Google Cloud Next 2025 conference, the latest additions to the Gemini Code Assist and Gemini Cloud Assist portfolio include AI agents that, via a natural language chat interface, can generate code using product specifications in Google Docs, migrate code from one language to another, create code to address issues described in a GitHub repository, generate and run tests and, finally, create documentation. Google is also previewing…
In the world of troubleshooting and collaborative debugging, sharing command-line output and error logs is an essential task. Whether you’re asking for help in online forums, communicating with colleagues, or submitting bug reports, providing clear, concise, and easily accessible logs can save everyone time and effort. If you need an easy and efficient way to share your terminal output, termbin is a great tool to use, as it allows you to quickly and securely share any terminal output by providing a unique URL to access the logs. In this article, we’ll take a look at termbin, how to use it,…
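For reference, termbin's basic usage pattern is to pipe output to its netcat endpoint; the piped command below is only an example:

    # Send the last 50 lines of the kernel log to termbin; it replies with a
    # short URL where the pasted output can be viewed.
    dmesg | tail -n 50 | nc termbin.com 9999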
Generative AI is transforming software development, but building and running AI models locally is still harder than it should be. Today’s developers face fragmented tooling, hardware compatibility headaches, and disconnected application development workflows, all of which hinder iteration and slow down progress. That’s why we’re launching Docker Model Runner — a faster, simpler way to run and test AI models locally, right from your existing workflow. Whether you’re experimenting with the latest LLMs or deploying to production, Model Runner brings the performance and control you need, without the friction. We’re also teaming up with some of the most influential names…
