In the first part of our article series, we explored the tension between Artificial Intelligence and sustainability: How can we ensure that AI is not just resource-intensive, but also contributes to solving ecological problems?
In this second part, we turn to the practical application: How can ChatGPT be used in everyday development to produce sustainable, energy-efficient code? We take a closer look at current research, analyze the strengths and weaknesses of LLMs, and show how these tools can already be tailored for Green Coding and put to use today.
Before diving into specific implementation methods in ChatGPT, we need to ask a fundamental question: Can a language model like ChatGPT even generate energy-efficient code—and if so, under what conditions? This potential needs to be confirmed before considering appropriate techniques and contexts for practical use. Let’s take a look at current research findings.
Can ChatGPT actually produce energy-efficient code?
The answer is: Yes—under certain conditions. But not automatically. A key source here is the study “Generating Energy‑efficient Code with LLMs” (Tom Cappendijk et al., 2024). In a controlled experiment, five large LLMs (including CodeLlama‑70b and DeepSeek‑Coder‑33b) were tasked with solving three different LeetCode problems. The prompts ranged from neutral instructions (“Solve this problem”) to specific phrasing such as “Give me an energy‑optimized solution…” or “Use library functions.”
The study concludes that energy savings through LLMs are generally possible—but not consistently. In some test scenarios, certain combinations of model, task type, and targeted prompting led to significant improvements in energy efficiency. For example, the use of efficient library functions or optimized loop structures resulted in savings of up to 60%. At the same time, there was no prompt that consistently produced more efficient code; in some cases, the models even generated considerably less efficient code, especially for more complex tasks.
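The library-function finding can be illustrated with a small, self-contained Python sketch (an illustration of the principle, not code from the study): Python's built-in `sum()` performs the same accumulation as a hand-written loop, but iterates in optimized C rather than the interpreter. Fewer CPU cycles for the same result generally translates into less energy.

```python
import timeit

def manual_sum(values):
    # Hand-written accumulation loop: every iteration runs in the interpreter.
    total = 0
    for v in values:
        total += v
    return total

def builtin_sum(values):
    # The built-in sum() does the same work in optimized C,
    # spending far fewer interpreter cycles per element.
    return sum(values)

data = list(range(100_000))
assert manual_sum(data) == builtin_sum(data)  # identical results

# Less CPU time for the same output generally means less energy.
t_manual = timeit.timeit(lambda: manual_sum(data), number=50)
t_builtin = timeit.timeit(lambda: builtin_sum(data), number=50)
print(f"manual:  {t_manual:.3f}s")
print(f"builtin: {t_builtin:.3f}s")
```

On a typical CPython build, the built-in variant runs several times faster; the exact factor varies by machine and interpreter version.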
The decisive factor is prompting: Only when the prompt explicitly references energy efficiency or specific coding practices does the likelihood increase that the generated solution will be resource-conscious. Without such guidance, the energy efficiency of the output remains largely unpredictable.
The study shows: LLMs like ChatGPT can generate energy-efficient code—if guided accordingly. What matters are clear prompts and a meaningful context. Without them, the potential remains untapped.
AI-powered Refactoring: What the Research Tells Us
A major shortcoming of existing approaches is their focus on evaluating AI-generated code. They assess how energy-efficient a given output is—but offer no concrete method for automatically improving existing code. This is where a research team from Purdue University (Peng et al., 2024) goes a step further: They developed a prototype for an automated refactoring system, in which two LLMs collaborate to systematically optimize existing code for energy efficiency.
Here’s how it works: A first model (e.g., GPT-4) analyzes the input code and suggests an optimized version designed to use fewer resources. A second model evaluates this suggestion—checking for functional correctness and measuring actual energy consumption using embedded telemetry and runtime data. Based on this feedback, the first model can then iteratively improve its suggestions.
The result: In about half of the test cases, energy consumption was significantly reduced—sometimes to less than half of the original level. Particularly notable: In several cases, the refactoring suggestions even outperformed what traditional compiler optimizations achieved.
This approach clearly highlights the potential of LLMs in sustainable software maintenance: Instead of merely generating efficient new code, they could also adapt existing code to meet modern efficiency standards—a promising outlook for future tools.
Green Coding Guidelines: How to Train ChatGPT with Context
To unlock the potential of LLMs like ChatGPT for everyday development, several proven strategies are already available. While research shows that LLMs do not inherently produce energy-efficient code, their output can be greatly improved through targeted prompting and contextual guidance.
If you want ChatGPT to consistently focus on energy efficiency in code generation, there are multiple ways to make that happen:
- Persistent Custom Instructions
  With custom instructions (accessible via the settings menu in the ChatGPT app, available only for ChatGPT Plus/Enterprise), you can store permanent prompts that are automatically applied in every conversation. These instructions are used across contexts, regardless of the project or topic at hand.
- Upload style guides or Markdown files
  GPT-4 supports file uploads. This allows you to provide your own guidelines directly as input, for example as `green_coding_guidelines.md`, and explicitly ask the model to follow them.
- Create your own GPTs with system prompts
  Using the "Explore" feature, you can build specialized GPTs that consistently follow Green Coding goals, including fixed prompts and optional knowledge files.
- Be clear and structured
  Rules should be positive, precise, and written in bullet points. Reinforcers such as “always” or “definitely” help the model prioritize effectively.
Good prompting is key—and sometimes a single sentence is enough to noticeably improve code suggestions. A prompt that has proven effective in practice reads:
“Act as an experienced software developer. Review the following code for performance, maintainability, and energy efficiency. Provide specific, actionable suggestions for improvement in each of these areas.”
This prompt is taken from a practical article (Ash Explained, 2024) and can be easily adapted. Developers looking to systematically generate or review more efficient code can embed such prompts directly or apply them as persistent settings.
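Beyond the ChatGPT app, the same prompt can be embedded programmatically so every automated review applies it. The sketch below builds a request payload in the OpenAI-style chat format (a `system` message followed by a `user` message); the model name `gpt-4o` and the helper name are illustrative choices, not prescribed by the article.

```python
GREEN_REVIEW_PROMPT = (
    "Act as an experienced software developer. Review the following code "
    "for performance, maintainability, and energy efficiency. Provide "
    "specific, actionable suggestions for improvement in each of these areas."
)

def build_review_request(code: str, model: str = "gpt-4o") -> dict:
    """Assemble a chat request that applies the review prompt to any code.

    The system message carries the persistent instruction; the user
    message carries the code to be reviewed, wrapped in a code fence.
    """
    return {
        "model": model,
        "messages": [
            {"role": "system", "content": GREEN_REVIEW_PROMPT},
            {"role": "user", "content": f"```\n{code}\n```"},
        ],
    }

request = build_review_request("for i in range(len(xs)): print(xs[i])")
print(request["messages"][0]["role"])  # the prompt rides along as "system"
```

Keeping the prompt in the system role means it persists across the conversation without being repeated in every user message.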
Green Coding Checklist (for Embedding or Uploading)
Here is a list of Green Coding Principles that can be incorporated in various ways within ChatGPT: directly in prompts, as system instructions in a custom GPT, or as a Markdown file uploaded during a conversation. This checklist can of course be refined or extended for project-specific needs:
Green Coding Guidelines for Energy-Efficient Software
General Principles
- Use efficient algorithms & data structures
- Avoid redundant computations
- Apply caching & memoization
- Avoid busy-waiting
- Keep memory usage low
- Parallelize tasks sensibly
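To make the general principles concrete: the caching & memoization point often costs a single decorator in standard-library Python. The Fibonacci function below is a stock illustration, not part of the checklist itself.

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def fib(n: int) -> int:
    # Memoization turns this naive recursion from exponential to linear:
    # each fib(k) is computed once and served from the cache afterwards,
    # eliminating the redundant computations (and the energy they burn).
    if n < 2:
        return n
    return fib(n - 1) + fib(n - 2)

print(fib(80))  # returns instantly; uncached, this recursion is infeasible
```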
Python-Specific
- Use vectorized NumPy operations instead of loops
- Prefer generators over lists
- Read files line by line
- Avoid object instantiation inside loops
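Two of the Python-specific points, generators over lists and line-by-line file reading, can be shown with stdlib-only code (the file content and names below are made up for the example):

```python
import os
import tempfile

# Generators vs. lists: a generator expression yields items on demand,
# so peak memory stays constant instead of growing with the input size.
squares_list = [i * i for i in range(1_000_000)]  # materializes ~1M ints
squares_gen = (i * i for i in range(1_000_000))   # holds one item at a time
total = sum(squares_gen)                          # same result, tiny footprint

# Line-by-line reading: iterating over the file object streams it with a
# small buffer, instead of pulling the whole file into memory via read().
with tempfile.NamedTemporaryFile("w", delete=False, suffix=".txt") as f:
    f.write("alpha\nbeta\ngamma\n")
    path = f.name

count = 0
with open(path) as f:
    for line in f:  # one line in memory at a time
        count += 1
os.remove(path)
print(total, count)
```

The same applies to the NumPy point: a vectorized call such as `np.sum(a * a)` pushes the loop into compiled code, analogous to the generator's win over interpreter-level iteration.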
Java-Specific
- Avoid object allocations in hot loops
- Use streams instead of full data loads
- Prefer primitive types
- Mark constants with `final`
Cross-Technology
- Move loop conditions outside the loop
- Use optimized libraries
- Choose suitable languages/platforms
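The first cross-technology point, hoisting loop-invariant work out of the loop, looks like this in Python (the function names are illustrative):

```python
def scale_inside(values, factor):
    out = []
    for v in values:
        # factor ** 2 never changes, yet it is recomputed every iteration
        out.append(v * (factor ** 2))
    return out

def scale_hoisted(values, factor):
    sq = factor ** 2  # loop-invariant: compute once, outside the loop
    out = []
    for v in values:
        out.append(v * sq)
    return out

# Identical results; the hoisted version simply does less work per iteration.
assert scale_inside([1, 2, 3], 3) == scale_hoisted([1, 2, 3], 3)
```

Optimizing compilers often hoist such expressions automatically, but interpreted languages like Python generally do not, which is why this habit pays off across technologies.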
Conclusion
Language models like ChatGPT can be a powerful tool for sustainable programming—provided they are used strategically. Research shows: Energy-efficient code does not happen by default. It requires clear instructions, thoughtful context, and the integration of green coding principles into the development process. By leveraging custom instructions, prompting strategies, or tailored style guides, developers can start using the potential of LLMs for sustainable software development today—reducing resource consumption while improving code quality. For more background on the intersection of AI and sustainability, check out the first part of our article series.