Commit

Merge branch 'develop' into pr-1873
shakkernerd committed Jan 8, 2025
2 parents 9e1ade7 + 24a754a commit 8a4b42b
Showing 211 changed files with 17,268 additions and 3,057 deletions.
26 changes: 22 additions & 4 deletions .env.example
@@ -88,8 +88,11 @@ TWITTER_TARGET_USERS= # Comma separated list of Twitter user names to
TWITTER_RETRY_LIMIT= # Maximum retry attempts for Twitter login
TWITTER_SPACES_ENABLE=false # Enable or disable Twitter Spaces logic

XAI_API_KEY=
XAI_MODEL=
# CONFIGURATION FOR APPROVING TWEETS BEFORE THEY GET POSTED
TWITTER_APPROVAL_DISCORD_CHANNEL_ID= # Channel ID for the Discord bot to listen on and send approval messages to
TWITTER_APPROVAL_DISCORD_BOT_TOKEN= # Discord bot token (this can be a different bot token from DISCORD_API_TOKEN)
TWITTER_APPROVAL_ENABLED= # Enable or disable Twitter approval logic. Default: false
TWITTER_APPROVAL_CHECK_INTERVAL=60000 # Default: 60 seconds
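
For illustration, a filled-in approval block might look like this (every value below is a hypothetical placeholder, not a real credential):

TWITTER_APPROVAL_ENABLED=true
TWITTER_APPROVAL_DISCORD_CHANNEL_ID=123456789012345678 # placeholder channel ID
TWITTER_APPROVAL_DISCORD_BOT_TOKEN=placeholder-discord-bot-token # placeholder token
TWITTER_APPROVAL_CHECK_INTERVAL=60000 # poll for approvals every 60 seconds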

# Post Interval Settings (in minutes)
POST_INTERVAL_MIN= # Default: 90
@@ -103,7 +106,6 @@ MAX_ACTIONS_PROCESSING=1 # Maximum number of actions (e.g., retweets, likes) to
ACTION_TIMELINE_TYPE=foryou # Type of timeline to interact with. Options: "foryou" or "following". Default: "foryou"

# Feature Flags
IMAGE_GEN= # Set to TRUE to enable image generation
USE_OPENAI_EMBEDDING= # Set to TRUE for OpenAI/1536, leave blank for local
USE_OLLAMA_EMBEDDING= # Set to TRUE for OLLAMA/1024, leave blank for local

@@ -268,6 +270,9 @@ CHARITY_ADDRESS_ETH=0x750EF1D7a0b4Ab1c97B7A623D7917CcEb5ea779C
CHARITY_ADDRESS_ARB=0x1234567890123456789012345678901234567890
CHARITY_ADDRESS_POL=0x1234567890123456789012345678901234567890

# thirdweb
THIRDWEB_SECRET_KEY= # Create key on thirdweb developer dashboard: https://thirdweb.com/

# Conflux Configuration
CONFLUX_CORE_PRIVATE_KEY=
CONFLUX_CORE_SPACE_RPC_URL=
@@ -396,6 +401,9 @@ STORY_API_BASE_URL= # Story API base URL
STORY_API_KEY= # Story API key
PINATA_JWT= # Pinata JWT for uploading files to IPFS

# Cosmos
COSMOS_RECOVERY_PHRASE= # 12-word recovery phrase (must be wrapped in quotes because it contains spaces)
COSMOS_AVAILABLE_CHAINS= # Comma-separated list of chains, e.g. mantrachaintestnet2,cosmos
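
For clarity, the quoting requirement above might look like this in practice (the twelve words here are placeholders, not a real mnemonic):

COSMOS_RECOVERY_PHRASE="word1 word2 word3 word4 word5 word6 word7 word8 word9 word10 word11 word12"
COSMOS_AVAILABLE_CHAINS=mantrachaintestnet2,cosmos
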
# Cronos zkEVM
CRONOSZKEVM_ADDRESS=
CRONOSZKEVM_PRIVATE_KEY=
@@ -407,14 +415,24 @@ FUEL_WALLET_PRIVATE_KEY=
TOKENIZER_MODEL= # Specify the tokenizer model to be used.
TOKENIZER_TYPE= # Options: tiktoken (for OpenAI models) or auto (AutoTokenizer from Hugging Face for non-OpenAI models). Default: tiktoken.


# Spheron
SPHERON_PRIVATE_KEY=
SPHERON_PROVIDER_PROXY_URL=
SPHERON_WALLET_ADDRESS=

# Stargaze NFT marketplace from Cosmos (You can use https://graphql.mainnet.stargaze-apis.com/graphql)
STARGAZE_ENDPOINT=

# API key for giphy from https://developers.giphy.com/dashboard/
GIPHY_API_KEY=

# GenLayer
GENLAYER_PRIVATE_KEY=0x0000000000000000000000000000000000000000000000000000000000000000 # Private key of the GenLayer account to use for the agent
GENLAYER_PRIVATE_KEY= # Private key of the GenLayer account to use for the agent in this format (0x0000000000000000000000000000000000000000000000000000000000000000)

# OpenWeather
OPEN_WEATHER_API_KEY= # OpenWeather API key

# Allora
ALLORA_API_KEY= # Allora API key, format: UP-f8db7d6558ab432ca0d92716
ALLORA_CHAIN_SLUG= # must be one of mainnet, testnet. If not specified, it will use testnet by default
8 changes: 1 addition & 7 deletions .github/workflows/jsdoc-automation.yml
@@ -20,7 +20,7 @@ on:
root_directory:
description: "Only scans files in this directory (relative to repository root, e.g., packages/core/src)"
required: true
default: "packages/plugin-near/"
default: "packages/plugin-bootstrap"
type: string
excluded_directories:
description: "Directories to exclude from scanning (comma-separated, relative to root_directory)"
@@ -37,11 +37,6 @@ on:
required: false
default: "develop"
type: string
language:
description: "Documentation language (e.g., English, Spanish, French)"
required: true
default: "English"
type: string

jobs:
generate-docs:
@@ -99,6 +94,5 @@ jobs:
INPUT_EXCLUDED_DIRECTORIES: ${{ inputs.excluded_directories }}
INPUT_REVIEWERS: ${{ inputs.reviewers }}
INPUT_BRANCH: ${{ inputs.branch }}
INPUT_LANGUAGE: ${{ inputs.language }}
INPUT_JSDOC: ${{ inputs.jsdoc }}
INPUT_README: ${{ inputs.readme }}
3 changes: 0 additions & 3 deletions README_CN.md
@@ -188,9 +188,6 @@ TWITTER_USERNAME= # Account username
TWITTER_PASSWORD= # Account password
TWITTER_EMAIL= # Account email
XAI_API_KEY=
XAI_MODEL=
# For asking Claude stuff
ANTHROPIC_API_KEY=
9 changes: 3 additions & 6 deletions README_ES.md
@@ -54,15 +54,15 @@ Para evitar conflictos en el directorio central, se recomienda agregar acciones

### Ejecutar con Llama

Puede ejecutar modelos Llama 70B o 405B configurando la variable de ambiente `XAI_MODEL` en `meta-llama/Meta-Llama-3.1-70B-Instruct-Turbo` o `meta-llama/Meta-Llama-3.1-405B-Instruct`
Puede ejecutar modelos Llama 70B o 405B configurando la variable de ambiente para un proveedor que soporte estos modelos. Llama también es soportado localmente si no se configura otro proveedor.

### Ejecutar con Grok

Puede ejecutar modelos Grok configurando la variable de ambiente `XAI_MODEL` en `grok-beta`
Puede ejecutar modelos Grok configurando la variable de ambiente `GROK_API_KEY` y configurando "grok" como proveedor en el archivo de caracteres.

### Ejecutar con OpenAI

Puede ejecutar modelos OpenAI configurando la variable de ambiente `XAI_MODEL` en `gpt-4o-mini` o `gpt-4o`
Puede ejecutar modelos OpenAI configurando la variable de ambiente `OPENAI_API_KEY` y configurando "openai" como proveedor en el archivo de caracteres.

## Requisitos Adicionales

@@ -99,9 +99,6 @@ TWITTER_USERNAME= # Nombre de usuario de la cuenta
TWITTER_PASSWORD= # Contraseña de la cuenta
TWITTER_EMAIL= # Correo electrónico de la cuenta
XAI_API_KEY=
XAI_MODEL=
# Para consultar a Claude
ANTHROPIC_API_KEY=
9 changes: 7 additions & 2 deletions agent/package.json
@@ -37,7 +37,7 @@
"@elizaos/plugin-binance": "workspace:*",
"@elizaos/plugin-avail": "workspace:*",
"@elizaos/plugin-bootstrap": "workspace:*",
"@ai16z/plugin-cosmos": "workspace:*",
"@elizaos/plugin-cosmos": "workspace:*",
"@elizaos/plugin-intiface": "workspace:*",
"@elizaos/plugin-coinbase": "workspace:*",
"@elizaos/plugin-coinprice": "workspace:*",
@@ -50,6 +50,7 @@
"@elizaos/plugin-goat": "workspace:*",
"@elizaos/plugin-icp": "workspace:*",
"@elizaos/plugin-image-generation": "workspace:*",
"@elizaos/plugin-movement": "workspace:*",
"@elizaos/plugin-nft-generation": "workspace:*",
"@elizaos/plugin-node": "workspace:*",
"@elizaos/plugin-solana": "workspace:*",
@@ -70,9 +71,13 @@
"@elizaos/plugin-fuel": "workspace:*",
"@elizaos/plugin-avalanche": "workspace:*",
"@elizaos/plugin-web-search": "workspace:*",
"@elizaos/plugin-thirdweb": "workspace:*",
"@elizaos/plugin-genlayer": "workspace:*",
"@elizaos/plugin-depin": "workspace:*",
"@elizaos/plugin-open-weather": "workspace:*",
"@elizaos/plugin-obsidian": "workspace:*",
"@elizaos/plugin-arthera": "workspace:*",
"@elizaos/plugin-allora": "workspace:*",
"readline": "1.3.0",
"ws": "8.18.0",
"yargs": "17.7.2"
@@ -84,4 +89,4 @@
"ts-node": "10.9.2",
"tsup": "8.3.5"
}
}
}
28 changes: 18 additions & 10 deletions agent/src/index.ts
@@ -38,6 +38,7 @@ import { DirectClient } from "@elizaos/client-direct";
import { ThreeDGenerationPlugin } from "@elizaos/plugin-3d-generation";
import { abstractPlugin } from "@elizaos/plugin-abstract";
import { aptosPlugin } from "@elizaos/plugin-aptos";
import { alloraPlugin } from "@elizaos/plugin-allora";
import { avalanchePlugin } from "@elizaos/plugin-avalanche";
import { binancePlugin } from "@elizaos/plugin-binance";
import {
@@ -53,6 +54,7 @@ import { confluxPlugin } from "@elizaos/plugin-conflux";
import { cronosZkEVMPlugin } from "@elizaos/plugin-cronoszkevm";
import { echoChambersPlugin } from "@elizaos/plugin-echochambers";
import { evmPlugin } from "@elizaos/plugin-evm";
import { createCosmosPlugin } from "@elizaos/plugin-cosmos";
import { flowPlugin } from "@elizaos/plugin-flow";
import { fuelPlugin } from "@elizaos/plugin-fuel";
import { genLayerPlugin } from "@elizaos/plugin-genlayer";
@@ -70,15 +72,14 @@ import { teeMarlinPlugin } from "@elizaos/plugin-tee-marlin";
import { tonPlugin } from "@elizaos/plugin-ton";
import { webSearchPlugin } from "@elizaos/plugin-web-search";
import { giphyPlugin } from "@elizaos/plugin-giphy";

import { echoChamberPlugin } from "@elizaos/plugin-echochambers";
import { thirdwebPlugin } from "@elizaos/plugin-thirdweb";
import { zksyncEraPlugin } from "@elizaos/plugin-zksync-era";

import { availPlugin } from "@elizaos/plugin-avail";
import { openWeatherPlugin } from "@elizaos/plugin-open-weather";

import { artheraPlugin } from "@elizaos/plugin-arthera";
import { stargazePlugin } from "@elizaos/plugin-stargaze";

import { obsidianPlugin } from "@elizaos/plugin-obsidian";
import Database from "better-sqlite3";
import fs from "fs";
import net from "net";
@@ -282,8 +283,6 @@ export function getTokenForProvider(
settings.LLAMACLOUD_API_KEY ||
character.settings?.secrets?.TOGETHER_API_KEY ||
settings.TOGETHER_API_KEY ||
character.settings?.secrets?.XAI_API_KEY ||
settings.XAI_API_KEY ||
character.settings?.secrets?.OPENAI_API_KEY ||
settings.OPENAI_API_KEY
);
@@ -424,7 +423,8 @@ export async function initializeClients(
character.clients?.map((str) => str.toLowerCase()) || [];
elizaLogger.log("initializeClients", clientTypes, "for", character.name);

if (clientTypes.includes(Clients.DIRECT)) {
// Start Auto Client if "auto" detected as a configured client
if (clientTypes.includes(Clients.AUTO)) {
const autoClient = await AutoClientInterface.start(runtime);
if (autoClient) clients.auto = autoClient;
}
@@ -588,6 +588,9 @@ export async function createAgent(
getSecret(character, "WALLET_PUBLIC_KEY")?.startsWith("0x"))
? evmPlugin
: null,
getSecret(character, "COSMOS_RECOVERY_PHRASE") &&
getSecret(character, "COSMOS_AVAILABLE_CHAINS") &&
createCosmosPlugin(),
(getSecret(character, "SOLANA_PUBLIC_KEY") ||
(getSecret(character, "WALLET_PUBLIC_KEY") &&
!getSecret(character, "WALLET_PUBLIC_KEY")?.startsWith(
@@ -645,6 +648,7 @@
: null,
getSecret(character, "TEE_MARLIN") ? teeMarlinPlugin : null,
getSecret(character, "TON_PRIVATE_KEY") ? tonPlugin : null,
getSecret(character, "THIRDWEB_SECRET_KEY") ? thirdwebPlugin : null,
getSecret(character, "SUI_PRIVATE_KEY") ? suiPlugin : null,
getSecret(character, "STORY_PRIVATE_KEY") ? storyPlugin : null,
getSecret(character, "FUEL_PRIVATE_KEY") ? fuelPlugin : null,
@@ -660,14 +664,18 @@
getSecret(character, "GENLAYER_PRIVATE_KEY")
? genLayerPlugin
: null,
getSecret(character, "AVAIL_SEED") ? availPlugin : null,
getSecret(character, "AVAIL_APP_ID") ? availPlugin : null,
getSecret(character, "AVAIL_SEED") &&
getSecret(character, "AVAIL_APP_ID")
? availPlugin
: null,
getSecret(character, "OPEN_WEATHER_API_KEY")
? openWeatherPlugin
: null,
getSecret(character, "ARTHERA_PRIVATE_KEY")?.startsWith("0x")
getSecret(character, "OBSIDIAN_API_TOKEN") ? obsidianPlugin : null,
getSecret(character, "ARTHERA_PRIVATE_KEY")?.startsWith("0x")
? artheraPlugin
: null,
getSecret(character, "ALLORA_API_KEY") ? alloraPlugin : null,
].filter(Boolean),
providers: [],
actions: [],
36 changes: 10 additions & 26 deletions characters/cosmosHelper.character.json
@@ -3,15 +3,14 @@
"clients": [],
"modelProvider": "groq",
"settings": {
"secrets": {},
"voice": {
"model": "en_US-male-medium"
},
"chains": {
"cosmos": ["axelar", "carbon", "mantrachaintestnet2"]
}
},
"plugins": ["@ai16z/plugin-cosmos","@ai16z/plugin-bootstrap"],
"plugins": [],
"bio": [
"Expert in Cosmos ecosystem.",
"Knowledgeable in CosmWasm and Stargate.",
@@ -24,37 +23,22 @@
"Supports projects and interactions within the Cosmos ecosystem."
],
"knowledge": [
"knows EXACT cost to families under Kamala ($29,000)",
"understands REAL border numbers (worse than reported)",
"saw what really happened in Minneapolis 2020",
"remembers who begged for help (and when)",
"knows why Iran's president targeting us",
"understands Secret Service allocation (and why they do it)",
"knows REAL rally numbers (they hide them)",
"saw the TRUTH about China Virus response",
"understands states' rights better than anyone",
"knows why they're letting in illegal guns",
"remembers when America was AFFORDABLE",
"understands the REAL election interference",
"knows why they're scared of WorldLibertyFi",
"saw what they did to women's sports",
"understands the REAL Middle East situation",
"knows why missiles flying everywhere now",
"remembers perfect peace under Trump presidency",
"understands Democrat election strategy (letting in MILLIONS)",
"knows Kamala's REAL tax plans (coming for everything)",
"saw what they did to Minneapolis (and other cities)"
"Knows how Cosmos blockchain works",
"Knows what actions should he call for token transfer, swapping or bridging",
"Knows that users might want to do specific actions multiple times and should help them by doing it again.",
"Should always ask for confirmation before calling an COSMOS_TRANSFER, COSMOS_BRIDGE, COSMOS_SWAP actions.",
"Should call actions COSMOS_TRANSFER, COSMOS_BRIDGE, COSMOS_SWAP only after previous confirmation."
],
"messageExamples": [
[
{
"user": "{{user1}}",
"content": { "text": "Can you explain the Cosmos Hub?" }
"content": { "text": "Show my balances of my wallet on {{mantrachaintestnet2}}" }
},
{
"user": "CosmosHelper",
"content": {
"text": "The Cosmos Hub is the central blockchain in the Cosmos ecosystem, facilitating interoperability between connected blockchains."
"text": "Your balances on chain {{mantrachaintestnet2}} are: \n - 13456.124 OM\n - 1222 ONDO\n 0.122122 USDY"
}
}
],
@@ -97,12 +81,12 @@
[
{
"user": "{{user1}}",
"content": { "text": "What are validators?" }
"content": { "text": "Make transfer 0.0001 OM to mantra13248w8dtnn07sxc3gq4l3ts4rvfyat6fks0ecj on mantrachaintestnet2" }
},
{
"user": "CosmosHelper",
"content": {
"text": "Validators are responsible for securing the network by validating transactions and producing new blocks. They earn rewards through staking."
"text": "Sure, your transfer i being processed."
}
}
]
14 changes: 4 additions & 10 deletions docs/README.md
@@ -59,15 +59,15 @@ To avoid git clashes in the core directory, we recommend adding custom actions t

### Run with Llama

You can run Llama 70B or 405B models by setting the `XAI_MODEL` environment variable to `meta-llama/Meta-Llama-3.1-70B-Instruct-Turbo` or `meta-llama/Meta-Llama-3.1-405B-Instruct`
You can run Llama 70B or 405B models by setting the environment variable for a provider that supports these models. Llama is also supported locally if no other provider is set.

### Run with Grok

You can run Grok models by setting the `XAI_MODEL` environment variable to `grok-beta`
You can run Grok models by setting the `GROK_API_KEY` environment variable to your Grok API key and setting grok as the model provider in your character file.

### Run with OpenAI

You can run OpenAI models by setting the `XAI_MODEL` environment variable to `gpt-4-mini` or `gpt-4o`
You can run OpenAI models by setting the `OPENAI_API_KEY` environment variable to your OpenAI API key and setting openai as the model provider in your character file.
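
As a minimal sketch of the pattern described in these three sections (the fields shown follow the cosmosHelper character file elsewhere in this commit; the key value is a placeholder), a character file selecting OpenAI might contain:

{
    "modelProvider": "openai",
    "clients": [],
    "settings": { "secrets": {} }
}

with OPENAI_API_KEY=sk-placeholder set in your .env, or GROK_API_KEY and "modelProvider": "grok" for Grok.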

## Additional Requirements

@@ -103,10 +103,6 @@ TWITTER_USERNAME= # Account username
TWITTER_PASSWORD= # Account password
TWITTER_EMAIL= # Account email
X_SERVER_URL=
XAI_API_KEY=
XAI_MODEL=
# For asking Claude stuff
ANTHROPIC_API_KEY=
@@ -143,9 +139,7 @@ Make sure that you've installed the CUDA Toolkit, including cuDNN and cuBLAS.

### Running locally

Add XAI_MODEL and set it to one of the above options from [Run with
Llama](#run-with-llama) - you can leave X_SERVER_URL and XAI_API_KEY blank, it
downloads the model from huggingface and queries it locally
By default, the bot will download and use a local model. You can change this by setting the environment variables for the model you want to use.

# Clients
