
· 3 min read
Pajman Samadi

If you’re new to Ansible, it can feel overwhelming at first. Whether you’re trying to automate your infrastructure, deploy applications, or manage configurations, Ansible has a lot to offer. But don’t let that scare you away: with a little knowledge and practice, you too can become proficient with Ansible. To help other beginners like me, I’ve put together a repository on GitHub detailing everything I’ve learned about Ansible so far. In this blog post, I’ll summarize some of the key concepts and tips I’ve picked up about this handy tool.

Understanding Ansible

At its core, Ansible is an open-source IT automation tool created by Red Hat. According to their website, Ansible “provides a simple, yet powerful, automation engine for tackling complex workflows and orchestrating seamless deployments.” The key advantage of Ansible is that it allows for automation without the need for special coding skills. With its simple YAML syntax, you can automate everything from deployment to configuration management to continuous delivery.

One important concept to understand when working with Ansible is that it takes a declarative approach to configuration management. Instead of specifying step-by-step instructions for tasks, you declare the state that you want your infrastructure to be in, and Ansible takes care of the details for you.

Ansible also uses a client-server architecture: the Ansible server (also known as the control machine) communicates with one or more Ansible clients (also known as managed nodes) to execute tasks. However, unlike other configuration management tools, Ansible does not require an agent to be installed on the client machines. Instead, it relies on the SSH or WinRM protocols to establish a connection.

Getting Started with Ansible

To get started, you’ll need to install Ansible on your control machine. The control machine runs on Linux or macOS (on Windows you can use WSL); managed nodes can run Linux, macOS, or Windows. Once you have Ansible installed, you can use it to execute tasks on one or more managed nodes.

A critical part of Ansible is the inventory: a file that specifies the managed nodes, and groups of nodes, that Ansible will target. The inventory can be a static file or generated dynamically from sources such as cloud providers, LDAP, or DNS.

One of Ansible’s great features is Ansible Galaxy, a large collection of pre-written roles and collections that can be easily installed and used in your playbooks. Additionally, Ansible provides a powerful templating system (Jinja2) that lets you customize your task execution.
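To make the inventory and playbook ideas concrete, here is a minimal sketch; the host names, group name, and package are placeholders rather than anything from a real setup.

# inventory.yml: a hypothetical static inventory
all:
  children:
    webservers:
      hosts:
        web1.example.com:
        web2.example.com:
      vars:
        ansible_user: deploy

# site.yml: declare the desired state; Ansible works out the steps
- name: Ensure nginx is installed and running
  hosts: webservers
  become: true
  tasks:
    - name: Install nginx
      ansible.builtin.package:
        name: nginx
        state: present
    - name: Start and enable nginx
      ansible.builtin.service:
        name: nginx
        state: started
        enabled: true

Running ansible-playbook -i inventory.yml site.yml applies this state over SSH; running it a second time changes nothing, which is the declarative model at work.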

Advanced Ansible Techniques

As you become more familiar with Ansible, you can start to take advantage of its more advanced features. Some of the ways you can level up your Ansible skills include:

  • Using Ansible Vault for encrypting sensitive data (see the sketch after this list)
  • Creating custom modules for specific tasks
  • Creating roles to group your tasks into reusable units
  • Utilizing Ansible playbooks for orchestration and customization
  • Creating dynamic inventories using APIs or other sources
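As an example of the first point, Ansible Vault is driven entirely from the command line; a minimal workflow might look like this (the file and playbook names are placeholders):

# create a new encrypted variables file (you will be prompted for a vault password)
ansible-vault create group_vars/all/secrets.yml

# or encrypt an existing plaintext file in place, and view it later
ansible-vault encrypt group_vars/all/vars.yml
ansible-vault view group_vars/all/vars.yml

# run a playbook that uses the encrypted variables
ansible-playbook site.yml --ask-vault-pass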

Conclusion

Ansible is a powerful automation tool that can help you streamline your infrastructure and configuration management. With its declarative syntax and client-server architecture, Ansible is easy to use and can scale to meet the demands of even the largest infrastructures. By reading through my GitHub repository and using the tips and concepts outlined in this blog post, you’ll be well on your way to mastering Ansible.


· One min read
Pajman Samadi

I recently worked with some wonderful tools, such as Obsidian and MkDocs, to generate and serve documentation. I use these tools to write notes and documents for my current course. You may wish to take a look at the links below to learn more about iOS.
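As a taste of how lightweight MkDocs is, a minimal setup might look like the following; the site name and pages are placeholders, not my actual course notes.

# mkdocs.yml: minimal configuration (placeholder names)
site_name: Course Notes
nav:
  - Home: index.md
  - Week 1: week-1.md

With MkDocs installed (pip install mkdocs), mkdocs serve previews the site locally and mkdocs build writes the static site to the site/ directory.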


· 2 min read
Pajman Samadi
Summary
  • An AI model for lung segmentation in CXR (chest X-ray) images, deployed with FastAPI.
  • Uses DVC for data version control and Docker to containerize the web app.
  • Deploys the project with Kubernetes (K8s).

This app uses FastAPI as the backend. Check out the repository on GitHub.
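If you have not used FastAPI before, the general shape of such a backend looks roughly like this; it is only an illustrative sketch, not the project’s actual app.py, and the endpoint is a placeholder.

# illustrative sketch only; the real app.py serves the segmentation model and the pages shown below
from fastapi import FastAPI

app = FastAPI()

@app.get("/")
def read_root():
    # placeholder response; the real app renders its main page here
    return {"status": "ok"}

Uvicorn (used below) looks for the app object inside app.py, which is what the app:app argument refers to.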

Usage for app.py

First, install the required libraries:

pip install -r requirements.txt

To run the application, run the following command in the src directory:

uvicorn app:app --reload

or

chmod +x app.sh
./app.sh

Tutorial for app.py

A short demo is available in app.gif. The running app exposes three pages:

  • Main page: http://localhost:8000/ (main.png)
  • FastAPI documentation: http://localhost:8000/docs (docs.png)
  • Show results: http://localhost:8000/imshow (imshow.png)


DVC

pip install dvc dvc-gdrive

# pull weights from Google Drive
dvc pull

After the pull, the weights directory contains:

weights
├── cxr_resunet.tflite
├── cxr_resunet.tflite.dvc
├── cxr_unet.tflite
└── cxr_unet.tflite.dvc
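If the Google Drive remote is not already set up in the repository’s .dvc/config, it can be added with dvc remote add before pulling; the folder ID below is a placeholder.

# add a Google Drive remote and make it the default (placeholder folder ID)
dvc remote add -d gdrive gdrive://<your-drive-folder-id>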

Docker

# build the image and run it
docker build -t IMAGE_NAME:TAG_NAME .
docker run -p 8000:8000 -d IMAGE_NAME:TAG_NAME

Or run a prebuilt image:

# for amd64 systems
docker run -d -p 8000:8000 pejmans21/ls-fastapi:0.1.0

# for arm64 systems
docker run -d -p 8000:8000 pejmans21/ls-fastapi:aarch64

Kubernetes

kubectl apply -f ls-fastapi-k8s-config.yaml

To see the output, forward the service port:

kubectl port-forward service/lsapi-service 8000

Now check http://127.0.0.1:8000/
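For reference, here is a minimal sketch of what a config like ls-fastapi-k8s-config.yaml might contain, reusing the image and service name from the commands above; the Deployment name is a placeholder and the repository’s actual file may differ.

# sketch of a Deployment plus Service; the repository's actual manifest may differ
apiVersion: apps/v1
kind: Deployment
metadata:
  name: lsapi-deployment   # placeholder name
spec:
  replicas: 1
  selector:
    matchLabels:
      app: lsapi
  template:
    metadata:
      labels:
        app: lsapi
    spec:
      containers:
        - name: lsapi
          image: pejmans21/ls-fastapi:0.1.0
          ports:
            - containerPort: 8000
---
apiVersion: v1
kind: Service
metadata:
  name: lsapi-service
spec:
  selector:
    app: lsapi
  ports:
    - port: 8000
      targetPort: 8000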

To stop, delete the resources:

kubectl delete -f ls-fastapi-k8s-config.yaml

· One min read
Pajman Samadi
Summary
  • Repository on GitHub.
  • A FaceRecognition module built with facenet-pytorch.
  • Uses Django as the web framework.
  • Dockerized project.

Module

from face_encoder import FaceEncoder
from PIL import Image
import numpy as np

# Load the module
FE = FaceEncoder('/path/to/main_folder_people/', img_size=160, recognition_threshold=0.3)

# Read images in the people folder to create a database of face encodings
encodes_db = FE.db_prepare(show_face=True, save_file=True, encoding_file_path='./data/encodes_db.pt')

"""--- Different ways to send an input to get results ---"""

# Image path as a string
image = FE.recognizer('/path/to/image', encodes_db, pil_write=True)
# PIL.Image
image = FE.recognizer(Image.open('/path/to/image'), encodes_db, pil_write=True)
# numpy array (here unknown_image is any previously loaded PIL image)
image = FE.recognizer(np.array(unknown_image), encodes_db, pil_write=True)

Colab

To test the module yourself, open the prepared Jupyter notebook in Colab via the following link:

Open In Colab

Usage

First, install the requirements:

$ pip3 install -r requirements.txt

Then, to start the Django server on your local machine, run the following commands in your terminal:

1- makemigrations

$ python3 manage.py makemigrations

2- migrate

$ python3 manage.py migrate

Optional

  • createsuperuser
$ python3 manage.py createsuperuser

3- runserver

$ python3 manage.py runserver 0.0.0.0:8000

Docker

$ docker-compose up --build

UI

(screenshot of the web UI)