DISCORD DEVELOPER PORTAL

458 bookmarks
Discord Developer Portal
Back to Applications · Selected app: Quarter Quest

Settings: General Information, Installation, OAuth2, Bot, Emojis, Webhooks, Rich Presence, App Testers, App Verification
Activities: Settings, URL Mappings, Art Assets
Monetization: Getting Started

General Information

"What should we call your creation? What amazing things does it do? What icon should represent it across Discord? Tell us here!" By clicking Create, you agree to the Discord Developer Terms of Service and Developer Policy.

App Icon
Name
Description (maximum 400 characters): your description will appear in the About Me section of your bot's profile.
Tags (maximum 5): add up to 5 tags to describe the content and functionality of your application.
Application ID: 1130437243639705651
Public Key: d07a1bd76c84562c9bc997e057930105ceaf0eff9d5a107734f2511db83059cb
Install Count: an approximate number of servers and users that have installed your application, updated daily. 0 servers, 0 individual users.
Interactions Endpoint URL: optionally configure an interactions endpoint to receive interactions via HTTP POSTs rather than over the Gateway with a bot user.
Linked Roles Verification URL: configure a verification URL to enable your application as a requirement in a server role's Links settings.
Terms of Service URL: a link to your application's Terms of Service.
Privacy Policy URL: a link to your application's Privacy Policy.
·discord.com·
Wrangler · Cloudflare Workers docs
Wrangler

Wrangler, the Cloudflare Developer Platform command-line interface (CLI), allows you to manage Worker projects.

- API: a set of programmatic APIs that can be integrated with local Cloudflare Workers-related workflows.
- Bundling: review Wrangler's default bundling.
- Commands: create, develop, and deploy your Cloudflare Workers with Wrangler commands.
- Configuration: use a configuration file to customize the development and deployment setup for your Worker project and other Developer Platform products.
- Custom builds: customize how your code is compiled before being processed by Wrangler.
- Deprecations: the differences between Wrangler versions, specifically deprecations and breaking changes.
- Environments: deploy the same Worker application with different configuration for each environment.
- Install/Update Wrangler: get started by installing Wrangler, and update to newer versions by following this guide.
- Migrations: review migration guides for specific versions of Wrangler.
- System environment variables: local environment variables that can change Wrangler's behavior.
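The Configuration and Environments entries above both refer to Wrangler's per-project config file. A minimal sketch of what such a file might look like; the worker name, date, and variable values below are placeholder assumptions, not taken from these docs:

```toml
# Minimal wrangler.toml sketch (placeholder values).
name = "my-worker"
main = "src/index.ts"
compatibility_date = "2024-05-01"

# Plain-text variables exposed to the Worker.
[vars]
API_HOST = "staging.example.com"

# Environments: the same Worker deployed with per-environment
# overrides, e.g. selected via `wrangler deploy --env production`.
[env.production]
name = "my-worker-production"

[env.production.vars]
API_HOST = "api.example.com"
```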
·developers.cloudflare.com·
awesome-telegram/cookiecutter-newbie/README.md at e9f1b516d9b50fdb0062689bcd3f84eca6ebfa51 · Cybersoulja/awesome-telegram
Cookiecutter

Create projects swiftly from cookiecutters (project templates) with this command-line utility. Ideal for generating Python package projects and more.

Documentation · GitHub · PyPI · License (BSD)

Installation

Install cookiecutter using the pip package manager:

    # pipx is strongly recommended.
    pipx install cookiecutter

    # If pipx is not an option, you can install cookiecutter
    # in your Python user directory.
    python -m pip install --user cookiecutter

Features

- Cross-Platform: supports Windows, Mac, and Linux.
- User-Friendly: no Python knowledge required.
- Versatile: compatible with Python 3.7 to 3.12.
- Multi-Language Support: use templates in any language or markup format.

For Users

Quick Start

The recommended way to use Cookiecutter as a command-line utility is to run it with pipx, which can be installed with pip install pipx. If you plan to use Cookiecutter programmatically, run pip install cookiecutter instead.

Use a GitHub template:

    # You'll be prompted to enter values. Cookiecutter then creates your
    # Python package in the current working directory, based on those values.
    # For the sake of brevity, repos on GitHub can just use the 'gh' prefix.
    $ pipx run cookiecutter gh:audreyfeldroy/cookiecutter-pypackage

Use a local template:

    $ pipx run cookiecutter cookiecutter-pypackage/

Use it from Python:

    from cookiecutter.main import cookiecutter

    # Create project from the cookiecutter-pypackage/ template
    cookiecutter('cookiecutter-pypackage/')

    # Create project from the cookiecutter-pypackage.git repo template
    cookiecutter('gh:audreyfeldroy/cookiecutter-pypackage.git')

Detailed Usage

- Generate projects from local or remote templates.
- Customize projects with cookiecutter.json prompts.
- Utilize pre-prompt, pre- and post-generate hooks.

Learn More

For Template Creators

- Utilize unlimited directory nesting.
- Employ Jinja2 for all templating needs.
- Define template variables easily with cookiecutter.json.

Learn More

Available Templates

Discover a variety of ready-to-use templates on GitHub.
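The cookiecutter.json mentioned above holds a template's variables and their defaults; each key becomes an interactive prompt, and a list value becomes a choice prompt. A small illustrative sketch (the variable names here are placeholders, not from any particular template):

```json
{
  "project_name": "My Project",
  "project_slug": "{{ cookiecutter.project_name|lower|replace(' ', '_') }}",
  "author_name": "Your Name",
  "open_source_license": ["MIT", "BSD-3-Clause", "GPL-3.0", "Not open source"]
}
```

Defaults may themselves be Jinja2 expressions, as in project_slug above, which derives a default from the answer given for project_name.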
Special Templates

- cookiecutter-pypackage
- cookiecutter-django
- cookiecutter-pytest-plugin
- cookiecutter-plone-starter

Community

Join the community, contribute, or seek assistance: Troubleshooting Guide, Stack Overflow, Discord, File an Issue, Contributors, Contribution Guide.

Support

Star us on GitHub. Stay tuned for upcoming support options.

Feedback

We value your feedback. Share your criticisms or complaints constructively to help us improve: File an Issue.

Waiting for a response? Be patient, and consider reaching out to the community for assistance. For urgent matters, contact @audreyfeldroy for consultation or custom development.

Code of Conduct

Adhere to the PyPA Code of Conduct during all interactions in the project's ecosystem.

Acknowledgements

Created and led by Audrey Roy Greenfeld, supported by a dedicated team of maintainers and contributors.
·github.com·
Introducing `lms` - LM Studio's companion cli tool | LM Studio
Introducing `lms` - LM Studio's companion cli tool
May 2, 2024 · By LM Studio Team

Today, alongside LM Studio 0.2.22, we're releasing the first version of lms, LM Studio's companion CLI tool. With lms you can load/unload models, start/stop the API server, and inspect raw LLM input (not just output). It's developed on GitHub, and we're welcoming issues and PRs from the community.

lms ships with LM Studio and lives in LM Studio's working directory, under ~/.cache/lm-studio/bin/. When you update LM Studio, it also updates your lms version. If you're a developer, you can also build lms from source.

Bootstrap lms on your system

You need to run LM Studio at least once before you can use lms. Afterwards, open your terminal and run one of these commands, depending on your operating system:

    # Mac / Linux:
    ~/.cache/lm-studio/bin/lms bootstrap

    # Windows:
    cmd /c %USERPROFILE%/.cache/lm-studio/bin/lms.exe bootstrap

Afterwards, open a new terminal window and run lms. This is the current output you will get:

    $ lms
    lms - LM Studio CLI - v0.2.22
    GitHub: https://github.com/lmstudio-ai/lmstudio-cli

    Usage
    lms <subcommand>

    where <subcommand> can be one of:
    - status - Prints the status of LM Studio
    - server - Commands for managing the local server
    - ls - List all downloaded models
    - ps - List all loaded models
    - load - Load a model
    - unload - Unload a model
    - create - Create a new project with scaffolding
    - log - Log operations. Currently only supports streaming logs from LM Studio via `lms log stream`
    - version - Prints the version of the CLI
    - bootstrap - Bootstrap the CLI

    For more help, try running `lms <subcommand> --help`

lms is MIT Licensed and it is developed in this repository on GitHub: https://github.com/lmstudio-ai/lms

Use lms to automate and debug your workflows

Start and stop the local server:

    lms server start
    lms server stop

List the local models on the machine:

    lms ls

This will reflect the current LM Studio models directory, which you set in the My Models tab in the app.

List the currently loaded models:

    lms ps

Load a model (with options):

    lms load [--gpu=max|auto|0.0-1.0] [--context-length=1-N]

--gpu=1.0 means 'attempt to offload 100% of the computation to the GPU'.

Optionally, assign an identifier to your local LLM:

    lms load TheBloke/phi-2-GGUF --identifier="gpt-4-turbo"

This is useful if you want to keep the model identifier consistent.

Unload models:

    lms unload [--all]

Debug your prompting with lms log stream

lms log stream allows you to inspect the exact input string that goes to the model. This is particularly useful for debugging prompt template issues and other unexpected LLM behaviors.

    $ lms log stream
    I Streaming logs from LM Studio

    timestamp: 5/2/2024, 9:49:47 PM
    type: llm.prediction.input
    modelIdentifier: TheBloke/TinyLlama-1.1B-1T-OpenOrca-GGUF/tinyllama-1.1b-1t-openorca.Q2_K.gguf
    modelPath: TheBloke/TinyLlama-1.1B-1T-OpenOrca-GGUF/tinyllama-1.1b-1t-openorca.Q2_K.gguf
    input: "Below is an instruction that describes a task. Write a response that appropriately completes the request. ### Instruction: Hello, what's your name? ### Response: "

lmstudio.js

lms uses lmstudio.js to interact with LM Studio. You can build your own programs that do what lms does and much more. lmstudio.js is in pre-release public alpha. Follow along on GitHub: https://github.com/lmstudio-ai/lmstudio.js.
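Once `lms server start` is running, LM Studio's local server speaks an OpenAI-compatible HTTP API, which by default listens on localhost port 1234. A minimal stdlib Python sketch of talking to it, assuming that default port and the "gpt-4-turbo" identifier assigned with `lms load` above; the helper names are our own, not part of lms:

```python
import json
import urllib.request

# Assumed default address of LM Studio's local server.
BASE_URL = "http://localhost:1234/v1"

def build_chat_payload(model: str, prompt: str, temperature: float = 0.7) -> dict:
    """Build an OpenAI-style chat completion request body."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "temperature": temperature,
    }

def chat(model: str, prompt: str) -> str:
    """POST the payload to the local server and return the reply text."""
    req = urllib.request.Request(
        f"{BASE_URL}/chat/completions",
        data=json.dumps(build_chat_payload(model, prompt)).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]
```

With the server running and a model loaded, `chat("gpt-4-turbo", "Hello, what's your name?")` returns the model's reply, and `lms log stream` in another terminal shows the exact prompt string the model received.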
Discuss all things lms and lmstudio.js in the new #dev-chat channel on the LM Studio Discord Server.

- Download LM Studio for Mac / Windows / Linux from https://lmstudio.ai
- LM Studio 0.2.22 with the AMD ROCm Technology Preview is available at https://lmstudio.ai/rocm
- LM Studio on Twitter: https://twitter.com/LMStudioAI
·lmstudio.ai·