Compare commits


No commits in common. "main" and "v2.0" have entirely different histories.
main ... v2.0

224 changed files with 779 additions and 11687 deletions

.gitignore vendored

@ -1 +0,0 @@
local/build-temp/*

LICENSE Normal file

@ -0,0 +1,7 @@
This Program is free software; you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation; version 2 of the License.
This Program is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for more details.
You should have received a copy of the GNU General Public License along with this Program; if not, write to the Free Software Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA.
In addition, as a special exception, Red Hat, Inc. gives You the additional right to link the code of this Program with code not covered under the GNU General Public License ("Non-GPL Code") and to distribute linked combinations including the two, subject to the limitations in this paragraph. Non-GPL Code permitted under this exception must only link to the code of this Program through those well defined interfaces identified in the file named EXCEPTION found in the source code files (the "Approved Interfaces"). The files of Non-GPL Code may instantiate templates or use macros or inline functions from the Approved Interfaces without causing the resulting work to be covered by the GNU General Public License. Only Red Hat, Inc. may make changes or additions to the list of Approved Interfaces. You must obey the GNU General Public License in all respects for all of the Program code and other code used in conjunction with the Program except the Non-GPL Code covered by this exception. If you modify this file, you may extend this exception to your version of the file, but you are not obligated to do so. If you do not wish to provide this exception without modification, you must delete this exception statement from your version and license this file solely under the GPL without exception.


@ -1,28 +1,27 @@
main version:

# ReachableCEO Fulltime Resume

## Introduction

This is the resume Charles N Wyble (@ReachableCEO) uses for seeking fulltime employment for systems engineering/SRE/devops roles.

It contains the markdown/csv files that represent his production resume for fulltime employment.

He also has a resume for consulting/freelance work: [here](https://git.knownelement.com/reachableceo/ReachableCEOConsultantResume) and it uses the same structure/process as this one.

Two core artifacts are output from those files by this process:

- PDF/MSWord Resume (auto parsed/populated by all job boards)
- PDF Candidate information sheet (streamlining the initial lead (dis)qualification process.)

## Production use

This resume is live on all the job portals and at:

- [My hiring site](https://resume.reachableceo.com)

## Supply chain to make your own

Please see [MarkdownResum-Pipeline-ClientExample repository](https://git.knownelement.com/reachableceo/MarkdownResume-Pipeline-ClientExample) if you would like to manage your resume the same way I do.

Clone that repository and follow the instructions.

That repository has the [MarkdownResume-Pipeline](https://git.knownelement.com/reachableceo/MarkdownResume-Pipeline) vendored into it.

v2.0 version:

# ReachableCEOResume

## Introduction

Resume formatting/publication/management is difficult, tedious, annoying etc. The @ReachableCEO has hacked the process and made it easy!

## Directory Overview

Two versions of the @ReachableCEO resume exist:

- non-cv
- cv

Elements that are common to both:

- contact info
- education

are in the common directory.

## Build pipeline

See the build.sh script in the non-cv/cv directories.

## Skills

Edit the skills.csv file in common. The build script will turn that into a table in the generated PDF.
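A minimal sketch of that skills-to-table step, assuming the pipe-separated three-column layout used by non-cv/Skills.csv (skill|years|details); the file path and the column meanings here are assumptions rather than something taken from the build scripts themselves:

```bash
#!/bin/bash
# Sketch: turn a pipe-separated skills CSV into a Markdown table that pandoc
# can render in the generated PDF. Assumes rows of the form: Skill|Years|Details.
SkillsFile="../common/skills.csv"   # assumed path

echo "| Skill | Experience | Details |"
echo "|-------|------------|---------|"
while IFS='|' read -r SKILL YEARS DETAILS; do
  echo "| $SKILL | $YEARS | $DETAILS |"
done < "$SkillsFile"
```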


@ -0,0 +1,12 @@
title: "Charles N Wyble Candidate Details"
titlepage: true
titlepage-logo: "D:/tsys/@ReachableCEO/ReachableCEO.png"
date: \today
header-left: "\\hspace{1cm}"
header-center: "\\leftmark"
header-right: "Page \\thepage"
footer-left: "Charles N Wyble"
footer-center: "Tenacity. Velocity. Focus."
footer-right: "[Source code for this file](https://github.com/ReachableCEO/ReachableCEOResume/blob/main/ancillary-support-files/CharlesNWybleCandidateInfo.md)"
urlcolor: blue
page-background: "D:/tsys/@ReachableCEO/ExternalVendorCode/pandoc-latex-template/examples/page-background/backgrounds/background1.pdf"


@ -0,0 +1,53 @@
Hello,
Thank you for writing to me regarding this role. I am interested in moving forward with it.
Please feel free to send me an RTR with the best rate you can offer. You may find my latest resume at the following URL:
- [Short resume](https://resume.reachableceo.com/non-cv/CharlesNWybleShortResume.pdf)
- [Long resume](https://resume.reachableceo.com/cv/CharlesNWybleCV.pdf)
Feel free to use either/both for my submission to the job as you feel appropriate.
I am open to w2, corp to corp (I have my own LLC), and 1099.
I WILL NOT share my ID over email or any other electronic written communication. I am happy to get on a teams/zoom/google meet etc call and show my ID.
Here are answers to common questions you may have:
| Question | Answer |
|---------------------------------------|----------------------------------------------------------------------------------------|
| Full name | Charles Wyble |
| E-mail address | reachableceo@reachableceo.com |
| Phone number | 818-280-7059 |
| Work authorization | US Citizen |
| Are you employed presently? | No |
| Current location | Austin, Texas |
| Availability to interview | Immediate |
| Availability to start | Immediate for remote/local, two weeks for relocation |
| Open to in-office/hybrid/remote | Yes |
| Any trips planned in next six months? | No |
| Highest Education | High School |
| Graduated Year | 2002 |
| Name of school | Osborne Christian School |
| Location of school | Los Angeles CA |
| Linkedin Profile | [Linkedin Profile](https://www.linkedin.com/in/charles-wyble-412007337/) |
| Github Profile | [Github Profile](https://www.github.com/ReachableCEO/) |
| Last project | Contract, ended October 2024 |
| DOB | 09/14 |
| Pay expectation | Open to discuss, send RTR with best hourly/annual rate as appropriate for the position |
| Total IT/career experience | 22 years |
Re relocation:
| Question | Answer |
|--------------------------------------|--------|
| Open to relocation? | Yes |
| Willing to re-locate at own expense? | No |
Please be aware that:
- net amount of the re-location benefit **MUST be $4,000 USD** or more to fully compensate me for the time/effort to relocate
- the full relocation benefit **must be provided 1 week or more prior to the confirmed start date**
- I will **only re-locate at the employer expense**
- I will need two weeks of time to re-locate


@ -0,0 +1,11 @@
#!/bin/bash
echo "Generating PDF..."
pandoc \
CharlesNWybleCandidateInfo.md \
--template eisvogel \
--metadata-file=./CandidateInfo.yml \
--from markdown \
--to=pdf \
--output /d/tsys/@ReachableCEO/resume.reachableceo.com/candidate-info/CharlesNWybleCandidateInfo.pdf
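The old pipeline (see CandidateVariables.env further down) also configures an MS Word output with a reference .docx. A hedged sketch of the equivalent Word rendering of this same source file; the reference-doc filename and output path are illustrative assumptions, not part of the script above:

```bash
# Sketch only: render the same Markdown to .docx as well.
# The reference .docx supplies Word styling; its name here is an assumption.
pandoc \
  CharlesNWybleCandidateInfo.md \
  --metadata-file=./CandidateInfo.yml \
  --from markdown \
  --to docx \
  --reference-doc=./resume-docx-reference.docx \
  --output ./CharlesNWybleCandidateInfo.docx
```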

common/Contact-Info.md Normal file

@ -0,0 +1,6 @@
Charles N Wyble
=====
Senior (**Staff level**) **System Engineer/SRE/Architect** with extensive Linux/Windows/Networking/Cyber security background and experience
[ [Github Profile](https://github.com/reachableceo) ] . [ [Linkedin Profile](https://www.linkedin.com/in/charles-wyble-412007337) ] . [ reachableceo@reachableceo.com ] . [ 818 280 7059 ] . [ Austin TX / Raleigh NC / Remote ]

common/Education.md Normal file

@ -0,0 +1,4 @@
## Education
High School Graduate

common/WorkHistory.csv Normal file

@ -0,0 +1,11 @@
CDK Global, Senior System Engineer, July 2024 - October 2024
Apple Computer, Senior System Administrator, March 2024 - July 2024
SHEIN.com, Staff Site Reliability Engineer, December 2022 - August 2023
3M, Senior Site Reliability Engineer, March 2020 - November 2022
TippingPoint, Staff System/Network Architect, March 2012 - June 2019
HostGator.com, Automation and Escalation Engineer, March 2011 - May 2012
RippleTV, Systems Engineer, October 2008 - January 2010
Walt Disney Internet Group, Site Reliability Engineer, August 2006 - September 2007
Electronic Clearing House, Senior Linux System Administrator, April 2005 - July 2006
GSI Commerce, HPUX/Windows/Linux System Administrator, March 2002 - February 2005
ReachableCEO Enterprises, Freelancer, January 2001 - December 2024
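Both build scripts split these rows on commas into exactly three fields (company, title, dates of employment). A quick sanity check along these lines can catch malformed rows; this is a sketch, not part of the repository:

```bash
# Sketch: flag any WorkHistory.csv row that does not have exactly three
# comma-separated fields, since the build scripts read fields $1..$3 with awk.
awk -F ',' 'NF != 3 { printf "line %d has %d fields: %s\n", NR, NF, $0 }' common/WorkHistory.csv
```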

cv/3M.md Normal file

@ -0,0 +1,3 @@
- Supported Vendavo on RedHat Linux, managed releases, and provided day-to-day developer support.
- Created a homegrown YAML configuration management system, utilizing bash scripting and YAML templates with a CSV-based key/value store to efficiently manage and regenerate environment-specific variables for a line-of-business application across multiple development, testing, staging, and production environments.

cv/Apple Computer.md Normal file

@ -0,0 +1,2 @@
- Day-to-day server operations: scheduling downtime, etc.


@ -1,2 +1,3 @@
- Security Compliance: worked with risk management/audit to remediate insecure configurations
- Created Alma9 Packer image from scratch

cv/Dell Residency.md Normal file

@ -0,0 +1,4 @@
- Rolled out centralized Active Directory authentication, deployed Dell OpenManage, and upgraded network equipment.
- Deployed password vault, Active Directory PKI, and implemented a ground-up network redesign.
- Designed VmWare NSX network.


@ -1,3 +1,4 @@
- Linux systems engineer in a 24x7 transaction processing/ecommerce/financial services environment, collaborating with network administration and infrastructure design teams.
- Ensured continuous uptime for high-impact environments, including a 1TB MySQL database, 300TB Oracle database, 1.5TB Oracle Data Warehouse, and a 4,000-store LAMP-based ecommerce system (MerchantAmerica.com).
- Successfully deployed an enterprise-wide Linux backup system, featuring encrypted backups stored on a central server with ISCSI attached network storage, utilizing GNUPG and tar over SSH. Regular backups and restores were tested weekly.


@ -1,3 +1,4 @@
- Contributed to disaster recovery from an Informix Database failure.
- Streamlined FTP server configuration and deployed open-source remote control software.
- Upgraded network infrastructure from hubs to managed switches.


@ -1,2 +1,4 @@
- Provided senior-level Linux and web application support globally.
- Developed standardized reply language and scripts, reducing errors in Level I Linux administrator department.

cv/HumanOutput-CV.yml Normal file

@ -0,0 +1,12 @@
title: "Charles N Wyble Resume"
titlepage: true
titlepage-logo: "D:/tsys/@ReachableCEO/ReachableCEO.png"
date: \today
header-left: "\\hspace{1cm}"
header-center: "\\leftmark"
header-right: "Page \\thepage"
footer-left: "Charles N Wyble"
footer-center: "Tenacity. Velocity. Focus."
footer-right: "[Source code for this resume](https://git.knownelement.com/reachableceo/ReachableCEOResume) "
urlcolor: blue
page-background: "D:/tsys/@ReachableCEO/ExternalVendorCode/pandoc-latex-template/examples/page-background/backgrounds/background5.pdf"

cv/MachineOutput.yml Normal file


@ -1,2 +1,3 @@
- Provided system engineering expertise for customer-facing advertising platform (AdSpot) and internal fleet management tool (CPanel).
- Utilized Nginx, Mongrel, Thin, Rails, Merb, Rack, MySQL, memcached, and F5 LTM.


@ -1,3 +1,4 @@
- Streamlined engineer onboarding by documenting and overhauling the process, consolidating disparate guides into a comprehensive modular set of documents.
- Established a taxonomy for team documentation in the wiki, implementing Confluence best practices for a proper knowledge base.
- Served as the SRE Liaison for cybersecurity functions, ensuring compliance with data locality/partition requirements and pending federal data privacy legislation.


@ -1,3 +1,4 @@
- Conducted code and design reviews for internal/external team projects and actively participated in broad enterprise collaboration, focusing on large-scale fleet management.
- Managed user account administration, manual/semi-automated server provisioning, trouble tickets, security vulnerability remediation, and system/network auditing.
- Led various projects, including migrating fleet systems from Centos 6 to Centos 7, implementing LXC/LXD container versions for increased system utilization, and creating an on-premise deployment system (GUMPS) for automated provisioning.


@ -0,0 +1,8 @@
- Engaged in a dynamic work environment focusing on popular web/ecommerce sites, including disneyworld.com and disneyland.com.
- Provided design, architecture, and day-to-day administration for Disney park property sites generating $2 billion annually.
- Offered system engineering and PCI compliance expertise for Disneyland.com, Disneyworld.com, DisneyCruiseLines.com, and 23 related properties.
- Utilized Jboss instances on RHEL3/4 for business logic and employed Windows 2003 with Tomcat/IIS for frontend application serving.
- Automated routine system administration tasks through the creation of batch and VBScript programs for Windows administration.
- Led an Active Directory project for WDIG, designing and implementing a nationwide, highly available system across 3 data centers.
- Managed the migration from Windows NT to Windows 2003 Active Directory domain controllers, including experience with Windows 2008, Centrify, and Samba/Winbind/LDAP/Kerberos.


@ -0,0 +1,54 @@
#!/bin/bash
#####################################
# Human readable CV
#####################################
HumanIntermediateOutputFile="./output/intermediate/human/CharlesNWybleCV.md"
# Ensure the output directory exists, then clear any previous intermediate file.
mkdir -p "$(dirname "$HumanIntermediateOutputFile")"
rm -f "$HumanIntermediateOutputFile"
# Combine markdown files into single input file for pandoc
#Pull in my contact info
cat "../common/Contact-Info.md" >> $HumanIntermediateOutputFile
echo " " >> $HumanIntermediateOutputFile
echo "## Employment History" >> $HumanIntermediateOutputFile
#And here we do some magic...
#Pull in my :
# employer
# title
# start/end dates of employment
# long form position summary data from each position
IFS=$'\n\t'
for position in \
  $(cat ../common/WorkHistory.csv); do
  # Split each CSV row into company, title, and dates of employment.
  COMPANY="$(echo "$position" | awk -F ',' '{print $1}')"
  TITLE="$(echo "$position" | awk -F ',' '{print $2}')"
  DATEOFEMPLOY="$(echo "$position" | awk -F ',' '{print $3}')"
  echo " " >> "$HumanIntermediateOutputFile"
  echo "**$COMPANY** | $TITLE | $DATEOFEMPLOY" >> "$HumanIntermediateOutputFile"
  echo " " >> "$HumanIntermediateOutputFile"
  # Company names can contain spaces ("Apple Computer"), so the path must stay quoted.
  cat "./$COMPANY.md" >> "$HumanIntermediateOutputFile"
  echo " " >> "$HumanIntermediateOutputFile"
done
unset IFS
# Run pandoc/etc to generate HTML/PDF/DOC into output dir
#First html/pdf/doc, for resume.reachableceo.com use
pandoc \
$HumanIntermediateOutputFile \
--template eisvogel \
--metadata-file=./HumanOutput-CV.yml \
--from markdown \
--to=pdf \
--output /d/tsys/@ReachableCEO/resume.reachableceo.com/cv/CharlesNWybleCV.pdf
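A presumed invocation of the script above, assuming pandoc plus the eisvogel LaTeX template are installed and the /d/tsys/... destination exists (the path style suggests a Git Bash / MSYS shell on Windows):

```bash
# Presumed usage; the eisvogel template and the /d/tsys/... destination are
# environment assumptions, not created by the script itself.
cd cv
bash build.sh
ls -l /d/tsys/@ReachableCEO/resume.reachableceo.com/cv/CharlesNWybleCV.pdf
```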


@ -0,0 +1,168 @@
############################################################
# Machine readable CV for the various employment platforms
############################################################
EmploymentPlatforms=(
"glassdoor"
"dice"
"guru"
"indeed"
"linkedin"
"teal"
"upwork"
"ziprecruiter"
)
#Per platform specific notes....
# Original idea here was to use the CSV file (| separated but whatever) and figure out (per platform) what was needed for formatting to be
# auto parsed
# ie
# function linkedin
# COMPANY=$1
# TITLE=$1
# EMPLOYMENTDATE=$1
# $COMPANY $EMPLOYMENTDATE $TITLE
# function glassdoor
# COMPANY=$1
# TITLE=$1
# EMPLOYMENTDATE=$1
# $COMPANY $TITLE $EMPLOYMENTDATE
# This may still be developed
# glassdoor
# Appears to not try to parse.
# indeed
# Appears to not try to parse.
# ziprecruiter
# ZipRecruiter (position parsing) (fixed manually, only one position wasn't properly captured)
# linkedin
# TBD, not sure how/if/when it parses the uploaded document...
# upwork
# Doesn't seem to parse the resume at all
# roberthalf
# Robert Half (not sure if it parses resume or not)
# dice
# DIce (skills)
# teal
# tbd
# guru
# tbd
# careerbuilder
# tbd
# oracle talent something something (most big companies appear to use this)
# tbd (once i apply for a job somewhere that uses that platform, i will update)
############################################################
# Machine readable CV for the various employment platforms
############################################################
for platform in "${EmploymentPlatforms[@]}"; do
MachineOutputIntermediateFile="./output/intermediate/machine/$platform/CharlesNWybleCV.md"
echo "Removing old resume for $platform..."
# Ensure the per-platform directory exists, then clear any previous intermediate file.
mkdir -p "$(dirname "$MachineOutputIntermediateFile")"
rm -f "$MachineOutputIntermediateFile"
done
IFS=$'\n\t'
for platform in "${EmploymentPlatforms[@]}"; do
echo "Creating pdf resume for $platform..."
MachineOutputIntermediateFile="./output/intermediate/machine/$platform/CharlesNWybleCV.md"
#Pull in my contact info
cat "../common/@ReachableCEO/Resume/Common/Contact-Info.md" >> "$MachineOutputIntermediateFile"
echo " " >> "$MachineOutputIntermediateFile"
#Pull in my skills
cat "../common/@ReachableCEO/Resume/Common/Skills.md" >> "$MachineOutputIntermediateFile"
echo " " >> "$MachineOutputIntermediateFile"
#And here we do some magic...
#Pull in my employer/title/dates of employment and my long form position summary data from each position
IFS=$'\n\t'
for position in \
$(cat ../common/WorkHistory.md|awk -F ',' '{print $1}'|sed -e 's/**//g'|sed '/##/d'|sed '/^$/d');
do
echo " " >> $MachineOutputIntermediateFile
POSITION_FILE_NAME="$(echo $position | awk -F ',' '{print $1}')"
cat "../cv/@ReachableCEO/Resume/CV/$POSITION_FILE_NAME.md" >> "$MachineOutputIntermediateFile"
echo " " >> "$MachineOutputIntermediateFile"
done
#Pull in my education info
cat "../common/Education.md" >> "$MachineOutputIntermediateFile"
pandoc \
$MachineOutputIntermediateFile \
--template eisvogel \
--from markdown \
--to=pdf \
--output /d/tsys/@ReachableCEO/resume.reachableceo.com/cv/CharlesNWybleCV.pdf
done
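As written, every platform iteration renders to the same output path, so later platforms overwrite earlier ones. A hedged sketch of per-platform output naming; the "-$platform" filename suffix is an assumption about the desired layout, not something the script specifies:

```bash
# Sketch: one PDF per platform instead of a single overwritten file.
for platform in "${EmploymentPlatforms[@]}"; do
  pandoc \
    "./output/intermediate/machine/$platform/CharlesNWybleCV.md" \
    --template eisvogel \
    --from markdown \
    --to=pdf \
    --output "/d/tsys/@ReachableCEO/resume.reachableceo.com/cv/CharlesNWybleCV-$platform.pdf"
done
```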


@ -0,0 +1,93 @@
Charles N Wyble
=====
Senior (**Staff level**) **System Engineer/SRE/Architect** with extensive Linux/Windows/Networking/Cyber security background and experience
[ [Github Profile](https://github.com/reachableceo) ] . [ [Linkedin Profile](https://www.linkedin.com/in/charles-wyble-412007337) ] . [ reachableceo@reachableceo.com ] . [ 818 280 7059 ] . [ Austin TX / Raleigh NC / Remote ]
## Employment History
**CDK Global** | Senior System Engineer | July 2024 - October 2024
- Security Compliance: worked with risk management/audit to remediate insecure configurations
- Created Alma9 Packer image from scratch
**Apple Computer** | Senior System Administrator | March 2024 - July 2024
- Day-to-day server operations: scheduling downtime, etc.
**SHEIN.com** | Staff Site Reliability Engineer | December 2022 - August 2023
- Streamlined engineer onboarding by documenting and overhauling the process, consolidating disparate guides into a comprehensive modular set of documents.
- Established a taxonomy for team documentation in the wiki, implementing Confluence best practices for a proper knowledge base.
- Served as the SRE Liaison for cybersecurity functions, ensuring compliance with data locality/partition requirements and pending federal data privacy legislation.
- Focused on fostering a culture of automation and skill development within the SRE team, emphasizing code review, infrastructure as code, versioning, testing, and effective ticket management.
- Contributed to Linux server administration both internally and externally, aiding colleagues with scripting/automation and assisting in the migration from AWS to Azure with zero customer-facing system impact. Additionally, provided day-to-day support for AWS and Azure activities and troubleshooting.
**3M** | Senior Site Reliability Engineer | March 2020 - November 2022
- Supported Vendavo on RedHat Linux, managed releases, and provided day-to-day developer support.
- Created a homegrown YAML configuration management system, utilizing bash scripting and YAML templates with a CSV-based key/value store to efficiently manage and regenerate environment-specific variables for a line-of-business application across multiple development, testing, staging, and production environments.
**TippingPoint** | Staff System/Network Architect | March 2012 - June 2019
- Conducted code and design reviews for internal/external team projects and actively participated in broad enterprise collaboration, focusing on large-scale fleet management.
- Managed user account administration, manual/semi-automated server provisioning, trouble tickets, security vulnerability remediation, and system/network auditing.
- Led various projects, including migrating fleet systems from Centos 6 to Centos 7, implementing LXC/LXD container versions for increased system utilization, and creating an on-premise deployment system (GUMPS) for automated provisioning.
- Deployed network monitoring systems (Zenoss, observium/librenms), utilized librenms as a Configuration Management Database (CMDB), and implemented a fleet orchestration system (Rundeck).
- Executed extensive vulnerability remediation, OS/application/kernel patching, NIC customization/optimization, and data migrations while developing and maintaining custom scripts for tasks such as LDAP management and SSL scenarios. Automated processes like re-imaging and ensured continuous distribution of a 40GB dataset of packet captures across a global fleet.
**HostGator.com** | Automation and Escalation Engineer | March 2011 - May 2012
- Provided senior-level Linux and web application support globally.
- Developed standardized reply language and scripts, reducing errors in Level I Linux administrator department.
**RippleTV** | Systems Engineer | October 2008 - January 2010
- Provided system engineering expertise for customer-facing advertising platform (AdSpot) and internal fleet management tool (CPanel).
- Utilized Nginx, Mongrel, Thin, Rails, Merb, Rack, MySQL, memcached, and F5 LTM.
**Walt Disney Internet Group** | Site Reliability Engineer | August 2006 - September 2007
- Engaged in a dynamic work environment focusing on popular web/ecommerce sites, including disneyworld.com and disneyland.com.
- Provided design, architecture, and day-to-day administration for Disney park property sites generating $2 billion annually.
- Offered system engineering and PCI compliance expertise for Disneyland.com, Disneyworld.com, DisneyCruiseLines.com, and 23 related properties.
- Utilized Jboss instances on RHEL3/4 for business logic and employed Windows 2003 with Tomcat/IIS for frontend application serving.
- Automated routine system administration tasks through the creation of batch and VBScript programs for Windows administration.
- Led an Active Directory project for WDIG, designing and implementing a nationwide, highly available system across 3 data centers.
- Managed the migration from Windows NT to Windows 2003 Active Directory domain controllers, including experience with Windows 2008, Centrify, and Samba/Winbind/LDAP/Kerberos.
**Electronic Clearing House** | Senior Linux System Administrator | April 2005 - July 2006
- Linux systems engineer in a 24x7 transaction processing/ecommerce/financial services environment, collaborating with network administration and infrastructure design teams.
- Ensured continuous uptime for high-impact environments, including a 1TB MySQL database, 300TB Oracle database, 1.5TB Oracle Data Warehouse, and a 4,000-store LAMP-based ecommerce system (MerchantAmerica.com).
- Successfully deployed an enterprise-wide Linux backup system, featuring encrypted backups stored on a central server with ISCSI attached network storage, utilizing GNUPG and tar over SSH. Regular backups and restores were tested weekly.
- Led the deployment of Oracle database infrastructure, implementing two Oracle RAC clusters with Dell 6850 servers, Quad Dual Core Xeon processors, and 32GB of RAM each. The clusters ran on RedHat Enterprise Linux 4.0 64bit edition, serving Data Warehouse, Transaction Processing Software, and Credit Card Clearing applications.
**GSI Commerce** | HPUX/Windows/Linux System Administrator | March 2002 - February 2005
- Contributed to disaster recovery from an Informix Database failure.
- Streamlined FTP server configuration and deployed open-source remote control software.
- Upgraded network infrastructure from hubs to managed switches.
**ReachableCEO Enterprises** | Freelancer | January 2001 - December 2024
- Deployed, configured, and supported Cloudron and Coolify PAAS and a full IT/SRE/Devops and engineering software stack for a stealth aerospace startup.
- Provided technician support to a team of electrical engineers building the power system for the radar of FrankenSAM in Ukraine. Handled high / low voltage wiring and plumbing and documentation of those systems.
- Provided root cause analysis and remediation for a security breach at a defense contractor.
- Developed a rapidly field deployable mesh networking system for a variety of use cases.
- Developed a secure global video conferencing system using only 3mbps for a major defense contractor.
- Advised on backend infrastructure for broadcasting news and information via radio and internet into hostile powers.


@ -0,0 +1,55 @@
Charles N Wyble
=====
Senior (Staff level) System Engineer/SRE/Architect with extensive Linux/Windows/Networking/Cyber security background and experience
[ [Github Profile](https://github.com/reachableceo) ] . [ [Linkedin Profile](https://www.linkedin.com/in/charles-wyble-412007337) ] . [ reachableceo@reachableceo.com ] . [ 818 280 7059 ] . [ Austin TX / Raleigh NC / Remote ]
## Skills
- **Linux** (22 years) : RHEL/Debian/Ubuntu, kickstart, PXE, LDAP, SSSD, RPM/Deb package creation, quotas, extended permissions, clustering, AppArmor, SeLinux, Centrify, Tripwire, Integrit, OSSEC.
- **Unix** ( 5 years) : HPUX/Solaris
- **Windows** (22 years) : Server (2008 - 2016), Windows client automated deployment (7,8,10,11), Active Directory, Group Policy, WSUS, Certificate Services, AD DNS, AD DHCP, complex multiple forest and domain setups, LDAP
- **Free/Libre/Open Source Server software** (22 years) : Apache, Postfix, Qmail, Dovecot, Courier, Nginx, Matomo, Discourse, Wordpress, Mautic, Dolibarr, Revive, Firefly, Cloudron, Coolify, Gitea, Gitlab, GitHub, Git, Jenkins, Rundeck, N8N, MySQL, PostgreSQL, LetsEncrypt, ACME, cfssl
- **Cyber Security** (22 years) : PCI Compliance, security hardening, audits, breach response and mitigation, patch and vulnerability management.
- **Networking** (22 years) : Linux Virtual Server, HAProxy, Ubiquiti Unifi, Opnsense, Pfsense, HP, Cisco, Arista, Dell, DNS, DHCP, IPAM, PXE, IPS, IDS, GRE, IPSEC, Wireguard, OpenVPN, Nebula, Tailscale, RADIUS. Mostly layer 2 data center/campus/access, some WAN, firewall, layer 3
- **Monitoring** (22 years) : Uptime kuma, librenms, zabbix, zenoss, nagios
- **Storage** (22 years) : NFS, Samba, CIFS, Netapp, ZFS, True/Free NAS, 3par, MSA, Equallogic, EMC, generic iscsi
- **Virtualization** (22 years) : VmWare, Parallels, HyperV, KVM, Xen
- **Containerization** (12 years) : LXC, Docker
- **Configuration management** (22 years) : Slack, Cfengine, Puppet, FetchApply, Ansible, Hashicorp Packer/Vault
- **Embedded** (5 years) : Raspberry pi, arduino, seeduino, Lego Mindstorms
- **Programming/Automation** (22 years) : Bash, J2ME, PHP, Ruby, PowerShell, TCL/TK, Java.
- **Ticket / incident / project management** (22 years): Jira, ServiceNow,Redmine,RT.
- **Git** (15 years) : branching, merging, multiple teams, external vendors, submodules etc.
- **LLM** (2 years) : OpenWebUI, Apple Silicon, QA/validation, RAG, data cleaning/prep etc.
- **Current growth/learning focus** : Prometheus, Grafana, CI/CD, GCP, AWS, Azure, Kubernetes, Helm. Also Saylor.org MBA program.
## Education
High School Graduate


@ -1,121 +0,0 @@
# {{CandidateName}} Candidate Information Sheet
## Introduction
Hello,
I apologize for the form letter response.
I receive a high volume of recruiter emails every day and I've found this letter to be the most efficient way to
handle the high volume of emails and reduce back and forth emails/texts/calls.
If you have any questions/comments/concerns not covered by this document, please let me know via e-mail and I'm happy to address them!
If you ask me something answered in this document, I will not respond and will not move forward with the opportunity, so please read it in detail!
## Re: share my ID over email
I WILL NOT share my (full or redacted) photo ID over email or any other electronic written
communication. If that is "required" then I have no interest in moving forward with this opportunity.
I am happy to get on a teams/zoom/google meet etc call and show my ID.
## Re: professional references
I am happy to provide professional references once an interview with the end client/customer/hiring manager/team has been scheduled. I will NOT provide references up front. If that is "required" then I have no interest in moving forward with this opportunity.
## Re: relocation
if the role is not based in **{{CandidateLocation}}** or **Other Location** I will need to re-locate
| Question | Answer |
|-------------------------------------------|--------|
| Am I open to relocation? | Yes |
| Am I willing to re-locate at own expense? | No |
| Am I open to up to 100% travel | Yes |
Please be aware that:
- I will **only re-locate at the employer expense**.
- I will need **two weeks of time** to re-locate.
- The net amount of the re-location benefit **MUST be at least {{CandidateRelocationNetMinimumAmount}}** to fully compensate me for the time/effort to re-locate.
- The full re-location benefit **must be provided prior to the confirmed start date**.
- I **will NOT** accept a reimbursement based re-location package.
- I am happy to come onsite (at client expense (paid up front)) for training/orientation etc.
\pagebreak
## Rate Schedule (compensation expectations)
### Fully remote roles
I have a **very strong** preference for fully remote roles.
I am open to (at the absolute bottom of my range):
- **{{CandidateRateSheetRemoteW2HourlyMinimum}}** per hour(w2)
- **{{CandidateRateSheetRemoteW2AnnualMinimum}}** annually
- **{{CandidateRateSheetRemote1099HourlyMinimum}}** per hour (1099/corp to corp)
I have a strong preference for roles that are :
- **{{CandidateRateSheetRemoteW2HourlyPrefer}}** per hour(w2) or more
- **{{CandidateRateSheetRemoteW2AnnualPrefer}}** annually or more
- **{{CandidateRateSheetRemote1099HourlyPrefer}}** per hour (1099/corp to corp) or more
### On-site/hybrid roles
- **{{CandidateRateSheetRemoteW2HourlyPrefer}}** per hour(w2) or more
- **{{CandidateRateSheetRemoteW2AnnualPrefer}}** annually or more
- **{{CandidateRateSheetRemote1099HourlyPrefer}}** per hour (1099/corp to corp) or more
In regards to compensation type, I am open to:
- w2
- corp to corp (I have my own LLC)
- 1099
If you have a rate for any of the compensation options above, send them all. I will pick which one works best for my situation and the opportunity.
If it's a different rate with/without benefits, send both.
If the above is in alignment with this opportunity, please feel free to send me an RTR with the best rate you can offer.
\pagebreak
## Details needed for submission
### My resume
[Download Candidate resume(format)](https://some.resume.somewhere/some-Resume.pdf)
I am happy to discuss and make edits to the resume content specific to the opportunity if you feel they are needed.
### Candidate details
Here are my complete candidate details for submission to the role.
| Question | Answer |
|---------------------------------------|-------------------------------------|
| Full name | {{CandidateName}} |
| E-mail address | {{CandiateEmail}} |
| Phone number | {{CandidatePhone}} |
| Preferred form of contact | {{CandidatePreferredContactMethod}} |
| Work authorization | {{CandidateWorkAuthorization}} |
| Are you employed presently? | {{CandidateEmploymentStatus}} |
| Current location | {{CandidateCurrentLocation}} |
| Current timezone | {{CandidateCurrentTimezone}} |
| Timezones I can work in | {{CandidateWorkableTimezones}} |
| Availability to interview | {{CandidateInterviewAvailability}} |
| Availability to start | {{CandidateStartAvailability}} |
| Highest Education | {{CandidateHighestEducation}} |
| Graduated Year | {{CandidateGraduationYear}} |
| Name of school | {{CandidateSchoolName}} |
| Location of school | {{CandidateSchoolLocation}} |
| Linkedin Profile | ({{CandidateLinkedin}}) |
| Github Profile | ({{CandidateGithub}}) |
| Last project | {{CandidateLastProject}} |
| DOB | {{CandidateDOB}} |
| Total IT/career experience | {{CandidateTotalExperience}} |
| Open to in-office/hybrid/remote | Yes |
| Any trips planned in next six months? | No |


@ -1,4 +0,0 @@
{{CandidateName}}
=====
{{CandidateOneLinerSummary}}


@ -1,6 +0,0 @@
{{CandidateName}}
=====
{{CandidateOneLinerSummary}}
[ [Github Profile]({{CandidateGithub}}) ] . [ [Linkedin Profile]({{CandidateLinkedin}}) ] . [ {{CandidateEmail}} ] . [ {{CandidatePhone}} ] . [ {{CandidateLocation}} ]


@ -1,4 +0,0 @@
- Supported Vendavo on RedHat Linux, managed releases, and provided day-to-day developer support.
- Created a homegrown YAML configuration management system, utilizing bash scripting and YAML templates with a CSV-based key/value store to efficiently manage and regenerate environment-specific variables for a line-of-business application across multiple development, testing, staging, and production environments.
\pagebreak


@ -1 +0,0 @@
- Day to day server operations scheduling downtime etc


@ -1,9 +0,0 @@
- Engaged in dynamic work environment focusing on popular web/ecommerce sites, including disneyworld.com and disneyland.com.
- Provided design, architecture, and day-to-day administration for Disney park property sites generating $2 billion annually.
- Provided system engineering and PCI compliance expertise for Disneyland.com, Disneyworld.com, DisneyCruiseLines.com, and 23 related properties.
- Utilized Jboss instances on RHEL3/4 for business logic and employed Windows 2003 with Tomcat/IIS for frontend application serving.
- Automated routine system administration tasks through the creation of batch and VBScript programs for Windows administration.
- Led an Active Directory project for WDIG, designing and implementing a nationwide, highly available system across 3 data centers.
- Managed the migration from Windows NT to Windows 2003 Active Directory domain controllers, including experience with Windows 2008, Centrify, and Samba/Winbind/LDAP/Kerberos.
\pagebreak


@ -1,20 +0,0 @@
Linux|22 years|RHEL,Debian,Ubuntu,kickstart,PXE, LDAP,SSSD,RPM/Deb package creation, quotas,extended permissions, clustering,NFS,Samba
Unix|5 years|HPUX/Solaris
Windows|22 years|Server (2008 2016),Windows client automated deployment (7,8,10,11),Active Directory,Group Policy,WSUS,Certificate Services,AD DNS,AD DHCP,complex multiple forest and domain setups
Free/Libre/Open Source software|22 years|Apache,Postfix,Qmail,Dovecot,Courier IMAP,Nginx,Matamo,Discourse,Wordpress, Mautic,Dolibarr,Revive Ad Server,Firefly,Cloudron,Coolify,Gitea, HomeAssistant, Jenkins,Rundeck,N8N, LetsEncrypt,ACME,cfssl
Databases|22 years| MySQL,PostgreSQL, Dbeaver,PHPMyAdmin,PostGIS
Cyber Security|22 years|PCI Compliance (tier 1 implementations),OpenVAS<,Lynis,security hardening,audits,breach response and mitigation, patch and vulnerability management. AppArmor, SeLinux, Centrify, Tripwire, Integrit, OSSEC
Virtualization|22 years|VmWare,Parallels,HyperV,KVM,Xen
Networking|22 years|Linux Virtual Server(LVS),HAProxy,Ubiquiti Unifi,Opnsense,Pfsense,DNS,DHCP,IPAM,PXE,IPS,IDS,GRE,IPSEC.Wireguard,OpenVPN,Nebula,Tailscale,RADIUS. Mostly layer2 data center/campus/access some WAN,firewall,layer3
Monitoring|22 years|Uptime Kuma,Librenms,Zabbix,Zenoss,Nagios,Elasticsearch,Logstash,Kibana(ELK)
Storage|22 years|Netapp,EMC,EqualLogic,3par,MSA,TrueNAS/ZFS,iscsi,S3,Azure Storage
Cloud|5 years|AWS,Azure,Kubernetes,Helm,Docker
Containerization|15 years|LXC,Docker,OpenVZ
Configuration management/InfrastructureAsCode(IAC)|22 years|FetchApply,Terraform/OpenTOfU,Ansible,AWX,Hashicorp Packer/Vault
Ticket / incident / project management| 22 years| Jira,ServiceNow,Redmine,RT
Git|15 years|Branching,merging,multiple teams,external vendors,submodules
SRE|5 years| Grafana,Prometheus,Signoz,Wazuh
LLM|2 years|OpenWebUI,QA/validation,RAG,data cleaning/prep
Programming|5 years|J2ME,PHP,Ruby,TCL/TK,Java,C,C++
Automation|22 years|Bash,YAML,TOML,PowerShell,Perl
Embedded development|5 years|Raspberry pi,arduino,seeduino,Lego Mindstorms


@ -1,11 +0,0 @@
CDK Global,Senior System Engineer,July 2024 - October 2024
Apple Computer,Senior System Administrator,March 2024 - July 2024
SHEIN,Staff Site Reliability Engineer,December 2022 - August 2023
3M,Senior Site Reliability Engineer,March 2020 - November 2022
TippingPoint,Staff System Architect,March 2012 - June 2019
HostGator.com,Automation and Escalation Engineer,March 2011 - May 2012
RippleTV,System Engineer,October 2008 - January 2010
Walt Disney Internet Group,Site Reliability Engineer,August 2006 - September 2007
Electronic Clearing House,Senior System Administrator,April 2005 - July 2006
GSI Commerce,System Administrator,March 2002 - February 2005
ReachableCEO Enterprises,Freelancer,January 2001 - December 2024


@ -1,14 +0,0 @@
title: "{{CandidateName}} Candidate Information Sheet"
titlepage: true
titlepage-logo: "{{CandidateLogo}}"
toc: true
toc-own-page: true
date: \today
header-left: "\\hspace{1cm}"
header-center: "\\leftmark"
header-right: "Page \\thepage"
footer-left: "{{CandidateName}}"
footer-center: "{{CandidateTagline}}"
footer-right: "[Source code]({{SourceCode}})"
urlcolor: {{URLCOLOR}}
page-background: "{{PAGEBACKGROUND}}"


@ -1,7 +0,0 @@
title: "{{CandidateName}} Resume"
header-left: "\\hspace{1cm}"
header-center: "\\leftmark"
header-right: "Page \\thepage"
footer-left: "{{CandidateName}}"
urlcolor: {{URLCOLOR}}
page-background: "{{PAGEBACKGROUND}}"


@ -1,12 +0,0 @@
title: "{{CandidateName}} Resume"
titlepage: true
titlepage-logo: "{{CandidateLogo}}"
date: \today
header-left: "\\hspace{1cm}"
header-center: "\\leftmark"
header-right: "Page \\thepage"
footer-left: "{{CandidateName}}"
footer-center: "{{CandidateTagline}}"
footer-right: "[Source code]({{SourceCode}})"
urlcolor: {{URLCOLOR}}
page-background: "{{PAGEBACKGROUND}}"


@ -1,102 +0,0 @@
#############################################################################
#SET THIS OR NOTHING WILL WORK
export PipelineClientWorkingDir="D:/tsys/ReachableCEOPublic/MarketingMaterials/backend/ReachableCEO-Profile-FTE/local"
#SET THIS OR NOTHING WILL WORK
#############################################################################
###################################################
# Modify these values to suit
###################################################
########################
# Contact info
########################
export CandidateName="Charles N Wyble"
export CandidatePhone="1 818 280 7059"
export CandidateLocation="Austin TX / Raleigh NC / Remote "
export CandidateEmail="reachableceo@reachableceo.com"
########################
# Profile information
########################
export CandidateOneLineSummary="Senior (**Staff level**) **System Engineer/SRE/Architect** with extensive Linux/Windows/Networking/Cyber security background and experience
"
export CandidateLinkedin="https://www.linkedin.com/in/charles-wyble-412007337"
export CandidateGithub="https://www.github.com/reachableceo"
export CandidateTagline="Tenacity. Velocity. Focus."
########################
# Formatting options
########################
export CandidateLogo="D:/tsys/ReachableCEOPublic/ReachableCEO.png"
export SourceCode="https://git.knownelement.com/reachableceo/ReachableCEO-Profile-FullTimeEmployment"
export URLCOLOR="blue"
export PAGEBACKGROUND="$PipelineClientWorkingDir/build/background5.pdf"
export PANDOC_TEMPLATE="eisvogel"
###########################
# Yaml files used by pandoc
###########################
export YamlInputTemplateFileCandidateInfoSheet="$PipelineClientWorkingDir/build/BuildTemplate-CandidateInfoSheet.yml"
export YamlInputTemplateFileJobBoard="$PipelineClientWorkingDir/build/BuildTemplate-JobBoard.yml"
export YamlInputTemplateFileClientSubmission="$PipelineClientWorkingDir/build/BuildTemplate-ClientSubmission.yml"
export WordOutputReferenceDoc="$PipelineClientWorkingDir/build/resume-docx-reference.docx"
##########################
# Candidate info sheet
##########################
export CandidatePreferredContactMethod="Email will get the fastest response."
export CandidateWorkAuthorization="US Citizen"
export CandidateEmploymentStatus="Not currently employed"
export CandidateCurrentLocation="Austin, TX"
export CandidateCurrentTimezone="CST"
export CandidateWorkableTimezones="PST/CST/EST"
export CandidateInterviewAvailability="Immediate"
export CandidateStartAvailability="Two weeks"
export CandidateHighestEducation="High School"
export CandidateGraduationYear="2002"
export CandidateSchoolName="Osborne Christian School"
export CandidateSchoolLocation="Los Angeles, CA"
export CandidateLastProject="CDK Global October 2024"
export CandidateDOB="09/14"
export CandidateTotalExperience="22 years"
########################
#Compensation targets
########################
export CandidateRelocationNetMinimumAmount="\$5000.00"
export CandidateRateSheetRemoteW2HourlyMinimum="\$60.00"
export CandidateRateSheetRemoteW2AnnualMinimum="\$120,000.00"
export CandidateRateSheetRemote1099HourlyMinimum="\$75.00"
export CandidateRateSheetRemoteW2HourlyPrefer="\$70.00"
export CandidateRateSheetRemoteW2AnnualPrefer="\$140,000.00"
export CandidateRateSheetRemote1099HourlyPrefer="\$85.00"
###############################
# Output location/filenames
###############################
RESUME_FILE_NAME="$(echo $CandidateName|sed 's/ //g')"
export BUILD_OUTPUT_DIR="D:/tsys/ReachableCEOPublic/MarketingMaterials/websites/profile-fte.reachableceo.com"
export CandidateInfoSheetMarkdownOutputFile="$BUILD_OUTPUT_DIR/recruiter/$RESUME_FILE_NAME-InfoSheet.md"
export CandidateInfoSheetPDFOutputFile="$BUILD_OUTPUT_DIR/recruiter/$RESUME_FILE_NAME-InfoSheet.pdf"
export JobBoardMarkdownOutputFile="$BUILD_OUTPUT_DIR/job-board/$RESUME_FILE_NAME-Resume.md"
export JobBoardPDFOutputFile="$BUILD_OUTPUT_DIR/job-board/$RESUME_FILE_NAME-Resume.pdf"
export JobBoardMSWordOutputFile="$BUILD_OUTPUT_DIR/job-board/$RESUME_FILE_NAME-Resume.doc"
export ClientSubmissionMarkdownOutputFile="$BUILD_OUTPUT_DIR/client-submit/$RESUME_FILE_NAME-Resume.md"
export ClientSubmissionPDFOutputFile="$BUILD_OUTPUT_DIR/client-submit/$RESUME_FILE_NAME-Resume.pdf"
export ClientSubmissionMSWordOutputFile="$BUILD_OUTPUT_DIR/client-submit/$RESUME_FILE_NAME-Resume.doc"
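These exported variables feed the mustache-style {{...}} placeholders in the template files above; the pipeline uses the vendored mo (mustache-in-bash) engine referenced in the client build script below. A minimal sketch of rendering one template by hand, with the template filename assumed for illustration:

```bash
# Sketch: mo substitutes {{VarName}} placeholders from exported environment
# variables. The template filename below is an illustrative assumption.
source ./CandidateVariables.env
bash ../../vendor/git.knownelement.com/ExternalVendorCode/mo/mo \
  ./CandidateInfoSheet.md > "$CandidateInfoSheetMarkdownOutputFile"
```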


@ -1,70 +0,0 @@
(Deleted binary file: a single-page A4 background PDF generated with cairo 1.14.8; the raw PDF object stream is omitted here as it is not meaningful as text.)


@ -1,35 +0,0 @@
#!/bin/bash
# A client script to setup variables for and execute:
#../vendor/git.knownelement.com/reachableceo/MarkdownResume-Pipeline/build/build-pipeline-server.sh
source ./CandidateVariables.env
####################################################
#DO NOT CHANGE ANYTHING BELOW THIS LINE
####################################################
##################################################################
# Setup globals for use by the build-pipeline-server.sh script
##################################################################
export MO_PATH="bash ../../vendor/git.knownelement.com/ExternalVendorCode/mo/mo"
export BUILD_TEMP_DIR="$PipelineClientWorkingDir/build-temp/MarkdownResume"
export BUILDYAML_JOBBOARD="$BUILD_TEMP_DIR/JobBoard.yml"
export BUILDYAML_CLIENTSUBMISSION="$BUILD_TEMP_DIR/ClientSubmission.yml"
export BUILDYAML_CANDIDATEINFOSHEET="$BUILD_TEMP_DIR/CandidateInfoSheet.yml"
# Cleanup previous intermediatge and final output artifacts
rm $BUILD_TEMP_DIR/*.yml
rm $BUILD_TEMP_DIR/*.md
rm $BUILD_OUTPUT_DIR/client-submit/*
rm $BUILD_OUTPUT_DIR/job-board/*
rm $BUILD_OUTPUT_DIR/recruiter/*
# Call the build-pipeline-server in the vendored repository to produce updated output artifacts
bash ../../vendor/git.knownelement.com/reachableceo/MarkdownResume-Pipeline/build/build-pipeline-server-markdown.sh
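The presumed end-to-end workflow for this client script, assuming the MarkdownResume-Pipeline repository is vendored at the path referenced above:

```bash
# Presumed workflow; the vendored pipeline path comes from the script above.
vi CandidateVariables.env   # set candidate details, paths, and rate targets
bash build.sh               # cleans build-temp/ and regenerates recruiter/, job-board/, client-submit/ outputs
```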


@ -0,0 +1,12 @@
title: "Charles N Wyble Resume"
titlepage: true
titlepage-logo: "D:/tsys/@ReachableCEO/ReachableCEO.png"
date: \today
header-left: "\\hspace{1cm}"
header-center: "\\leftmark"
header-right: "Page \\thepage"
footer-left: "Charles N Wyble"
footer-center: "Tenacity. Velocity. Focus."
footer-right: "[Source code for this resume](https://git.knownelement.com/reachableceo/ReachableCEOResume) "
urlcolor: blue
page-background: "D:/tsys/@ReachableCEO/ExternalVendorCode/pandoc-latex-template/examples/page-background/backgrounds/background5.pdf"


@ -2,10 +2,7 @@
- Developed and implemented an internal private cloud orchestration and provisioning system for a hardware development engineering team that handled the entire provisioning lifecycle for physical and virtual systems.
- Developed and implemented standardized language and procedures and incident investigation automation for a large technical support organization with high turnover.
- Developed and implemented an automated order status and payment handling interactive voice response application using Angel.com with a backend web service returning Voice XML. This allows call center personnel to focus on revenue generating opportunities instead of administrative matters.
- Provided technician support to a team of electrical engineers building the power system for the radar of FrankenSAM in Ukraine. Handled high / low voltage wiring and plumbing and documentation of those systems.
- Provided root cause analysis, mitigation, and remediation of security breaches by advanced persistent threat actors at high value targets.
- Project managed a successful brand new data center build from bare dirt to serving content in 86 days. Oversaw 8 billion dollars of capital deployment.
- Led and consulted on tier 1 Payment Card Industry (PCI) compliance implementations for some of the world's largest brands (including at a payment processor).
- Rolled out centralized Active Directory authentication, deployed Dell OpenManage, and upgraded network equipment.
- Deployed password vault, Active Directory PKI, and implemented a ground-up network redesign.
- Designed VmWare NSX network.

non-cv/Skills.csv Normal file

@ -0,0 +1,19 @@
Linux|22 years|RHEL,Debian,Ubuntu,kickstart,PXE, LDAP,SSSD,RPM/Deb package creation, quotas,extended permissions, clustering,NFS,Samba
Unix|5 years|HPUX/Solaris
Windows|22 years|Server (2008 2016),Windows client automated deployment (7,8,10,11),Active Directory,Group Policy,WSUS,Certificate Services,AD DNS,AD DHCP,complex multiple forest and domain setups
Free/Libre/Open Source software|22 years|Apache,Postfix,Qmail,Dovecot,Courier IMAP,Nginx,Matomo,Discourse,Wordpress, Mautic,Dolibarr,Revive Ad Server,Firefly,Cloudron,Coolify,Gitea, HomeAssistant, Jenkins,Rundeck,N8N, LetsEncrypt,ACME,cfssl
Databases|22 years| MySQL,PostgreSQL, Dbeaver,PHPMyAdmin,PostGIS
Cyber Security|22 years|PCI Compliance (tier 1 implementations),OpenVAS, Lynis, security hardening, audits, breach response and mitigation, patch and vulnerability management. AppArmor, SeLinux, Centrify, Tripwire, Integrit, OSSEC
Networking|22 years|Linux Virtual Server, HAProxy, Ubiquiti Unifi, Opnsense, Pfsense, HP, Cisco, Arista, Dell, DNS, DHCP, IPAM, PXE, IPS, IDS, GRE, IPSEC, Wireguard, OpenVPN, Nebula, Tailscale, RADIUS. Mostly layer2 data center/campus/access some WAN,firewall,layer3
Monitoring|22 years|Uptime Kuma, Librenms, Zabbix, Zenoss, Nagios, ELK (Elasticsearch, Logstash, Kibana)
Virtualization|22 years|VmWare, Parallels,HyperV, KVM, Xen
Cloud|5 years|AWS, Azure, GCP, Kubernetes
Containerization|15 years|LXC, Docker,OpenVZ
Configuration management/InfrastructureAsCode(IAC)|22 years|FetchApply, Terraform/OpenTOfU,Ansible, AWX,Hashicorp Packer/Vault
Ticket / incident / project management| 22 years| Jira, ServiceNow,Redmine,RT
Git|15 years|Branching, merging, multiple teams, external vendors, submodules
SRE|4 years| Grafana, Prometheus, Signoz, Wazuh
LLM|2 years|OpenWebUI, Apple Silicon, QA/validation, RAG, data cleaning/prep
Programming|5 years|J2ME,PHP,Ruby,TCL/TK,Java,C,C++
Automation|22 years|Bash,YAML,TOML,PowerShell,Perl
Embedded development|5 years|Raspberry Pi, Arduino, Seeeduino, LEGO Mindstorms

135
non-cv/build.sh Normal file
View File

@ -0,0 +1,135 @@
#!/bin/bash
##########################################################
#Global variables
##########################################################
IntermediateOutputFile="./output/intermediate/CharlesNWybleShortResume.md"
FinalOutputFilePDF="/d/tsys/@ReachableCEO/resume.reachableceo.com/non-cv/CharlesNWybleShortResume.pdf"
#FinalOutputFileHTML="/d/tsys/@ReachableCEO/resume.reachableceo.com/non-cv/CharlesNWybleShortResume.html"
#FinalOutputFileDOC="/d/tsys/@ReachableCEO/resume.reachableceo.com/non-cv/CharlesNWybleShortResume.docx"
cleanup()
{
rm -f "$IntermediateOutputFile"
}
# Combine markdown files into single input file for pandoc
echo "Combining markdown files..."
createMdContact()
{
#Pull in my contact info
cat "../common/Contact-Info.md" >> $IntermediateOutputFile
echo " " >> $IntermediateOutputFile
}
createMdSkills()
{
#Pull in my skills
echo "## Skills" >> "$IntermediateOutputFile"
#Table heading
echo "|Skill|Experience|Skill Details|" >> $IntermediateOutputFile
echo "|---|---|---|" >> $IntermediateOutputFile
#Table rows
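#Each Skills.csv row is pipe-delimited: Skill|Years of experience|Details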
IFS=$'\n\t'
for skill in \
$(cat ./Skills.csv); do
SKILL_NAME="$(echo "$skill" | awk -F '|' '{print $1}')"
SKILL_YEARS="$(echo "$skill" | awk -F '|' '{print $2}')"
SKILL_DETAIL="$(echo "$skill" | awk -F '|' '{print $3}')"
echo "|**$SKILL_NAME**|$SKILL_YEARS|$SKILL_DETAIL|" >> $IntermediateOutputFile
done
unset IFS
echo "\pagebreak" >> $IntermediateOutputFile
}
createMdProjects()
{
#Pull in my projects
## <p align="center"> My development environment </p>
echo "## Highlights from my 22 year of experience" >> "$IntermediateOutputFile"
echo
cat "./Projects.md" >> $IntermediateOutputFile
echo " " >> $IntermediateOutputFile
echo "\pagebreak" >> $IntermediateOutputFile
}
createMdWorkHistory()
{
#Pull in my work history
echo " " >> $IntermediateOutputFile
echo "## Employment History" >> $IntermediateOutputFile
echo " " >> $IntermediateOutputFile
echo "Comprehensive employment history available on my [Linkedin Profile](https://www.linkedin.com/in/charles-wyble-412007337)" >> $IntermediateOutputFile
echo " " >> $IntermediateOutputFile
IFS=$'\n\t'
for position in \
$(cat ../common/WorkHistory.csv); do
COMPANY="$(echo "$position" | awk -F ',' '{print $1}')"
TITLE="$(echo "$position" | awk -F ',' '{print $2}')"
DATEOFEMPLOY="$(echo "$position" | awk -F ',' '{print $3}')"
echo " " >> "$IntermediateOutputFile"
echo "**$COMPANY** | $TITLE | $DATEOFEMPLOY" >> $IntermediateOutputFile
echo " " >> "$IntermediateOutputFile"
done
unset IFS
echo "\pagebreak" >> $IntermediateOutputFile
}
generateFinalOutputFilePdf()
{
# Run pandoc to generate PDF into output dir
echo "Generating PDF..."
pandoc \
$IntermediateOutputFile \
--template eisvogel \
--metadata-file=./HumanOutput-NonCV.yml \
--from markdown \
--to=pdf \
--output $FinalOutputFilePDF
}
cleanup
createMdContact
createMdProjects
createMdWorkHistory
createMdSkills
generateFinalOutputFilePdf
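A quick way to exercise this script end to end (a sketch, not part of the repository: it assumes pandoc with the eisvogel template is installed and that the `output/intermediate/` directory exists, since the script appends the combined markdown there before rendering to the absolute `FinalOutputFilePDF` path defined above):

```bash
cd non-cv
mkdir -p output/intermediate   # location of the intermediate combined markdown used above
bash build.sh                  # concatenates contact, projects, work history, and skills, then renders the PDF via pandoc
```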

View File

@ -0,0 +1,79 @@
Charles N Wyble
=====
Senior (**Staff level**) **System Engineer/SRE/Architect** with an extensive Linux/Windows/networking/cybersecurity background and experience
[ [GitHub Profile](https://github.com/reachableceo) ] . [ [LinkedIn Profile](https://www.linkedin.com/in/charles-wyble-412007337) ] . [ reachableceo@reachableceo.com ] . [ 818 280 7059 ] . [ Austin TX / Raleigh NC / Remote ]
## Highlights from my 22 years of experience
- Developed and implemented a process to switch thousands of desktops providing digital signage functionality from Fedora to Debian in a completely automated fashion using a custom initrd.
- Developed and implemented an internal private cloud orchestration and provisioning system for a hardware development engineering team that handled the entire provisioning lifecycle for physical and virtual systems.
- Developed and implemented standardized language, procedures, and incident investigation automation for a large technical support organization with high turnover.
- Developed and implemented an automated order status and payment handling interactive voice response application using Angel.com with a backend web service returning VoiceXML, allowing call center personnel to focus on revenue-generating opportunities instead of administrative matters.
- Provided technician support to a team of electrical engineers building the power system for the radar of FrankenSAM in Ukraine. Handled high/low voltage wiring, plumbing, and documentation of those systems.
- Provided root cause analysis, mitigation, and remediation of security breaches by advanced persistent threat actors at high-value targets.
- Project managed a successful brand-new data center build from bare dirt to serving content in 86 days. Oversaw 8 billion dollars of capital deployment.
- Led and consulted on tier 1 Payment Card Industry (PCI) compliance implementations for some of the world's largest brands (including at a payment processor).
\pagebreak
## Employment History
Comprehensive employment history available on my [LinkedIn Profile](https://www.linkedin.com/in/charles-wyble-412007337)
**CDK Global** | Senior System Engineer | July 2024 - October 2024
**Apple Computer** | Senior System Administrator | March 2024 - July 2024
**SHEIN.com** | Staff Site Reliability Engineer | December 2022 - August 2023
**3M** | Senior Site Reliability Engineer | March 2020 - November 2022
**TippingPoint** | Staff System/Network Architect | March 2012 - June 2019
**HostGator.com** | Automation and Escalation Engineer | March 2011 - May 2012
**RippleTV** | Systems Engineer | October 2008 - January 2010
**Walt Disney Internet Group** | Site Reliability Engineer | August 2006 - September 2007
**Electronic Clearing House** | Senior Linux System Administrator | April 2005 - July 2006
**GSI Commerce** | HPUX/Windows/Linux System Administrator | March 2002 - February 2005
**ReachableCEO Enterprises** | Freelancer | January 2001 - December 2024
\pagebreak
## Skills
|Skill|Experience|Skill Details|
|---|---|---|
|**Linux**|22 years|RHEL,Debian,Ubuntu,kickstart,PXE, LDAP,SSSD,RPM/Deb package creation, quotas,extended permissions, clustering,NFS,Samba|
|**Unix**|5 years|HPUX/Solaris|
|**Windows**|22 years|Server (2008-2016),Windows client automated deployment (7,8,10,11),Active Directory,Group Policy,WSUS,Certificate Services,AD DNS,AD DHCP,complex multiple forest and domain setups |
|**Free/Libre/Open Source software**|22 years|Apache,Postfix,Qmail,Dovecot,Courier IMAP,Nginx,Matomo,Discourse,WordPress, Mautic,Dolibarr,Revive Ad Server,Firefly,Cloudron,Coolify,Gitea, HomeAssistant, Jenkins,Rundeck,N8N, LetsEncrypt,ACME,cfssl|
|**Databases**|22 years| MySQL,PostgreSQL, DBeaver,PHPMyAdmin,PostGIS|
|**Cyber Security**|22 years|PCI Compliance (tier 1 implementations),OpenVAS, Lynis, security hardening, audits, breach response and mitigation, patch and vulnerability management. AppArmor, SELinux, Centrify, Tripwire, Integrit, OSSEC |
|**Networking**|22 years|Linux Virtual Server, HAProxy, Ubiquiti Unifi, Opnsense, Pfsense, HP, Cisco, Arista, Dell, DNS, DHCP, IPAM, PXE, IPS, IDS, GRE, IPSEC. Wireguard, OpenVPN, Nebula, Tailscale, RADIUS. Mostly layer 2 data center/campus/access, some WAN, firewall, layer 3 |
|**Monitoring**|22 years|Uptime Kuma, LibreNMS, Zabbix, Zenoss, Nagios, ELK (Elasticsearch, Logstash, Kibana)|
|**Virtualization**|22 years|VMware, Parallels,Hyper-V, KVM, Xen|
|**Cloud**|5 years|AWS, Azure, GCP, Kubernetes|
|**Containerization**|15 years|LXC, Docker,OpenVZ|
|**Configuration management/Infrastructure as Code (IaC)**|22 years|FetchApply, Terraform/OpenTofu,Ansible, AWX,HashiCorp Packer/Vault|
|**Ticket / incident / project management**| 22 years| Jira, ServiceNow,Redmine,RT|
|**Git**|15 years|Branching, merging, multiple teams, external vendors, submodules |
|**SRE**|4 years| Grafana, Prometheus, Signoz, Wazuh|
|**LLM**|2 years|OpenWebUI, Apple Silicon, QA/validation, RAG, data cleaning/prep|
|**Programming**|5 years|J2ME,PHP,Ruby,TCL/TK,Java,C,C++|
|**Automation**|22 years|Bash,YAML,TOML,PowerShell,Perl|
|**Embedded development**|5 years|Raspberry Pi, Arduino, Seeeduino, LEGO Mindstorms|
\pagebreak

View File

@ -1,168 +0,0 @@
parserOptions:
ecmaVersion: latest
sourceType: module
env:
es6: true
jasmine: true
node: true
extends: eslint:recommended
rules:
accessor-pairs: error
array-bracket-spacing:
- error
- never
array-callback-return: error
block-spacing:
- error
- never
brace-style: error
comma-dangle: error
comma-spacing: error
comma-style: error
complexity:
- error
- 10
computed-property-spacing: error
consistent-return: error
consistent-this: error
constructor-super: error
curly: error
default-case: error
dot-notation: error
eol-last: error
eqeqeq: error
generator-star-spacing: error
global-require: off
guard-for-in: error
jsx-quotes: error
key-spacing: error
keyword-spacing: error
linebreak-style: error
lines-around-comment:
- error
-
allowBlockStart: true
allowObjectStart: true
allowArrayStart: true
max-statements-per-line: error
new-cap: error
new-parens: error
no-array-constructor: error
no-bitwise: error
no-caller: error
no-case-declarations: error
no-catch-shadow: error
no-class-assign: error
no-cond-assign: error
no-confusing-arrow: error
no-console: off
no-const-assign: error
no-constant-condition: error
no-continue: error
no-delete-var: error
no-dupe-args: error
no-dupe-class-members: error
no-dupe-keys: error
no-duplicate-case: error
no-duplicate-imports: error
no-empty: off
no-empty-character-class: error
no-empty-pattern: error
no-eq-null: error
no-eval: error
no-extend-native: error
no-extra-bind: error
no-extra-boolean-cast: error
no-extra-label: error
no-extra-semi: error
no-fallthrough: error
no-func-assign: error
no-implied-eval: error
no-inner-declarations: error
no-invalid-this: error
no-invalid-regexp: error
no-irregular-whitespace: error
no-iterator: error
no-label-var: error
no-labels: error
no-lone-blocks: error
no-lonely-if: error
no-loop-func: error
no-mixed-spaces-and-tabs: error
no-multi-spaces: error
no-multi-str: error
no-multiple-empty-lines:
- error
-
max: 2
no-native-reassign: error
no-negated-condition: error
no-nested-ternary: error
no-new: error
no-new-func: error
no-new-object: error
no-new-symbol: error
no-new-wrappers: error
no-obj-calls: error
no-octal: error
no-octal-escape: error
no-path-concat: error
no-plusplus: error
no-proto: error
no-redeclare: error
no-regex-spaces: error
no-restricted-globals: error
no-return-assign: error
no-script-url: error
no-self-assign: error
no-self-compare: error
no-sequences: error
no-shadow: error
no-shadow-restricted-names: error
no-spaced-func: error
no-sparse-arrays: error
no-this-before-super: error
no-throw-literal: error
no-trailing-spaces: error
no-undef: error
no-undef-init: error
no-unexpected-multiline: error
no-unmodified-loop-condition: error
no-unneeded-ternary: error
no-unreachable: error
no-unsafe-finally: error
no-unused-expressions: error
no-unused-labels: error
no-unused-vars: error
no-useless-call: error
no-useless-computed-key: error
no-useless-concat: error
no-useless-constructor: error
no-useless-escape: error
no-void: error
no-warning-comments: warn
no-whitespace-before-property: error
no-with: error
operator-assignment: error
padded-blocks:
- error
- never
prefer-const: error
quote-props:
- error
- as-needed
radix: error
require-yield: error
semi: error
semi-spacing: error
space-before-blocks: error
space-in-parens: error
space-infix-ops:
- error
-
int32Hint: false
space-unary-ops: error
spaced-comment: error
use-isnan: error
valid-typeof: error
yield-star-spacing: error

View File

@ -1,13 +0,0 @@
name: CI
on: [push]
jobs:
test:
name: Test
runs-on: ubuntu-latest
steps:
- name: Check out code
uses: actions/checkout@v1
- name: Run tests
run: ./run-tests
- name: Run against spec
run: ./run-spec

View File

@ -1,45 +0,0 @@
name: docker push
on: [push]
jobs:
push_to_registry:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@master
- name: Docker meta
if: startsWith(github.ref, 'refs/tags/')
id: docker_meta
uses: crazy-max/ghaction-docker-meta@v1
with:
images: ghcr.io/${{ github.repository }}
tag-match: v(.*)
- name: Set up QEMU
if: startsWith(github.ref, 'refs/tags/')
uses: docker/setup-qemu-action@v1
- name: Set up Docker Buildx
id: buildx
uses: docker/setup-buildx-action@v1
- name: Cache Docker layers
if: startsWith(github.ref, 'refs/tags/')
uses: actions/cache@v2
with:
path: /tmp/.buildx-cache
key: ${{ runner.os }}-buildx-${{ github.sha }}
restore-keys: |
${{ runner.os }}-buildx-
- name: Login to GitHub Container Registry
if: startsWith(github.ref, 'refs/tags/')
uses: docker/login-action@v1
with:
registry: ghcr.io
username: ${{ github.repository_owner }}
password: ${{ secrets.GITHUB_TOKEN }}
- name: Build and push
uses: docker/build-push-action@v2
if: startsWith(github.ref, 'refs/tags/')
with:
builder: ${{ steps.buildx.outputs.name }}
platforms: linux/amd64,linux/arm64
tags: ${{ steps.docker_meta.outputs.tags }}
cache-from: type=local,src=/tmp/.buildx-cache
cache-to: type=local,dest=/tmp/.buildx-cache
push: true

View File

@ -1,9 +0,0 @@
*.swp
diagnostic.partial
diagnostic.test
tests/*.diff
spec/
spec-runner/
node_modules/
package.json
package-lock.json

View File

@ -1,616 +0,0 @@
API / Function Documentation
============================
This documentation is generated automatically from the source of [mo] thanks to [tomdoc.sh].
`mo()`
------
Public: Template parser function. Writes templates to stdout.
* $0 - Name of the mo file, used for getting the help message.
* $@ - Filenames to parse.
Returns nothing.
`mo::debug()`
-------------
Internal: Show a debug message
* $1 - The debug message to show
Returns nothing.
`mo::debugShowState()`
----------------------
Internal: Show a debug message and internal state information
No arguments
Returns nothing.
`mo::error()`
-------------
Internal: Show an error message and exit
* $1 - The error message to show
* $2 - Error code
Returns nothing. Exits the program.
`mo::errorNear()`
-----------------
Internal: Show an error message with a snippet of context and exit
* $1 - The error message to show
* $2 - The starting point
* $3 - Error code
Returns nothing. Exits the program.
`mo::usage()`
-------------
Internal: Displays the usage for mo. Pulls this from the file that contained the `mo` function. Can only work when the right filename is passed as the one argument, and that only happens when `mo` is called with `$0` set to this file.
* $1 - Filename that has the help message
Returns nothing.
`mo::content()`
---------------
Internal: Fetches the content to parse into MO_UNPARSED. Can be a list of partials for files or the content from stdin.
* $1 - Destination variable name
* $2-@ - File names (optional), read from stdin otherwise
Returns nothing.
`mo::contentFile()`
-------------------
Internal: Read a file into MO_UNPARSED.
* $1 - Destination variable name.
* $2 - Filename to load - if empty, defaults to /dev/stdin
Returns nothing.
`mo::indirect()`
----------------
Internal: Send a variable up to the parent of the caller of this function.
* $1 - Variable name
* $2 - Value
Examples
callFunc () {
local "$1" && mo::indirect "$1" "the value"
}
callFunc dest
echo "$dest" # writes "the value"
Returns nothing.
`mo::indirectArray()`
---------------------
Internal: Send an array as a variable up to caller of a function
* $1 - Variable name
* $2-@ - Array elements
Examples
callFunc () {
local myArray=(one two three)
local "$1" && mo::indirectArray "$1" "${myArray[@]}"
}
callFunc dest
echo "${dest[@]}" # writes "one two three"
Returns nothing.
`mo::trimUnparsed()`
--------------------
Internal: Trim leading characters from MO_UNPARSED
Returns nothing.
`mo::chomp()`
-------------
Internal: Remove whitespace and content after whitespace
* $1 - Name of the destination variable
* $2 - The string to chomp
Returns nothing.
`mo::parse()`
-------------
Public: Parses text, interpolates mustache tags. Utilizes the current value of MO_OPEN_DELIMITER, MO_CLOSE_DELIMITER, and MO_STANDALONE_CONTENT. Those three variables shouldn't be changed by user-defined functions.
* $1 - Destination variable name - where to store the finished content
* $2 - Content to parse
* $3 - Preserve standalone status/content - truthy if not empty. When set to a value, that becomes the standalone content value
Returns nothing.
`mo::parseInternal()`
---------------------
Internal: Parse MO_UNPARSED, writing content to MO_PARSED. Interpolates mustache tags.
No arguments
Returns nothing.
`mo::parseBlock()`
------------------
Internal: Handle parsing a block
* $1 - Invert condition ("true" or "false")
Returns nothing
`mo::parseBlockFunction()`
--------------------------
Internal: Handle parsing a block whose first argument is a function
* $1 - Invert condition ("true" or "false")
* $2-@ - The parsed tokens from inside the block tags
Returns nothing
`mo::parseBlockArray()`
-----------------------
Internal: Handle parsing a block whose first argument is an array
* $1 - Invert condition ("true" or "false")
* $2-@ - The parsed tokens from inside the block tags
Returns nothing
`mo::parseBlockValue()`
-----------------------
Internal: Handle parsing a block whose first argument is a value
* $1 - Invert condition ("true" or "false")
* $2-@ - The parsed tokens from inside the block tags
Returns nothing
`mo::parsePartial()`
--------------------
Internal: Handle parsing a partial
No arguments.
Indentation will be applied to the entire partial's contents before parsing. This indentation is based on the whitespace that ends the previously parsed content.
Returns nothing
`mo::parseComment()`
--------------------
Internal: Handle parsing a comment
No arguments.
Returns nothing
`mo::parseDelimiter()`
----------------------
Internal: Handle parsing the change of delimiters
No arguments.
Returns nothing
`mo::parseValue()`
------------------
Internal: Handle parsing value or function call
No arguments.
Returns nothing
`mo::isFunction()`
------------------
Internal: Determine if the given name is a defined function.
* $1 - Function name to check
Be extremely careful. Even if strict mode is enabled, it is not honored in newer versions of Bash. Any errors that crop up here will not be caught automatically.
Examples
moo () {
echo "This is a function"
}
if mo::isFunction moo; then
echo "moo is a defined function"
fi
Returns 0 if the name is a function, 1 otherwise.
`mo::isArray()`
---------------
Internal: Determine if a given environment variable exists and if it is an array.
* $1 - Name of environment variable
Be extremely careful. Even if strict mode is enabled, it is not honored in newer versions of Bash. Any errors that crop up here will not be caught automatically.
Examples
var=(abc)
if mo::isArray var; then
echo "This is an array"
echo "Make sure you don't accidentally use \$var"
fi
Returns 0 if the name is not empty, 1 otherwise.
`mo::isArrayIndexValid()`
-------------------------
Internal: Determine if an array index exists.
* $1 - Variable name to check
* $2 - The index to check
Has to check if the variable is an array and if the index is valid for that type of array.
Returns true (0) if everything was ok, 1 if there's any condition that fails.
`mo::isVarSet()`
----------------
Internal: Determine if a variable is assigned, even if it is assigned an empty value.
* $1 - Variable name to check.
Can not use logic like this in case invalid variable names are passed. [[ "${!1-a}" == "${!1-b}" ]]
Returns true (0) if the variable is set, 1 if the variable is unset.
`mo::isTruthy()`
----------------
Internal: Determine if a value is considered truthy.
* $1 - The value to test
* $2 - Invert the value, either "true" or "false"
Returns true (0) if truthy, 1 otherwise.
`mo::evaluate()`
----------------
Internal: Convert token list to values
* $1 - Destination variable name
* $2-@ - Tokens to convert
Sample call:
mo::evaluate dest NAME username VALUE abc123 PAREN 2
Returns nothing.
`mo::evaluateListOfSingles()`
-----------------------------
Internal: Convert an argument list to individual values.
* $1 - Destination variable name
* $2-@ - A list of argument types and argument name/value.
This assumes each value is separate from the rest. In contrast, mo::evaluate will pass all arguments to a function if the first value is a function.
Sample call:
mo::evaluateListOfSingles dest NAME username VALUE abc123
Returns nothing.
`mo::evaluateSingle()`
----------------------
Internal: Evaluate a single argument
* $1 - Name of variable for result
* $2 - Type of argument, either NAME or VALUE
* $3 - Argument
Returns nothing
`mo::evaluateKey()`
-------------------
Internal: Return the value for @key based on current's name
* $1 - Name of variable for result
Returns nothing
`mo::evaluateVariable()`
------------------------
Internal: Handle a variable name
* $1 - Destination variable name
* $2 - Variable name
Returns nothing.
`mo::findVariableName()`
------------------------
Internal: Find the name of a variable to use
* $1 - Destination variable name, receives an array
* $2 - Variable name from the template
The array contains the following values
* [0] - Variable name
* [1] - Array index, or empty string
Example variables
a="a"
b="b"
c=("c.0" "c.1")
d=([b]="d.b" [d]="d.d")
Given these inputs (function input, current value), produce these outputs
a c => a
a c.0 => a
b d => d.b
b d.d => d.b
a d => d.a
a d.d => d.a
c.0 d => c.0
d.b d => d.b
'' c => c
'' c.0 => c.0
Returns nothing.
`mo::join()`
------------
Internal: Join / implode an array
* $1 - Variable name to receive the joined content
* $2 - Joiner
* $3-@ - Elements to join
Returns nothing.
`mo::evaluateFunction()`
------------------------
Internal: Call a function.
* $1 - Variable for output
* $2 - Content to pass
* $3 - Function to call
* $4-@ - Additional arguments as list of type, value/name
Returns nothing.
`mo::standaloneCheck()`
-----------------------
Internal: Check if a tag appears to have only whitespace before it and after it on a line. There must be a new line before and there must be a newline after or the end of a string
No arguments.
Returns 0 if this is a standalone tag, 1 otherwise.
`mo::standaloneProcess()`
-------------------------
Internal: Process content before and after a tag. Remove prior whitespace up to the previous newline. Remove following whitespace up to and including the next newline.
No arguments.
Returns nothing.
`mo::indentLines()`
-------------------
Internal: Apply indentation before any line that has content in MO_UNPARSED.
* $1 - Destination variable name.
* $2 - The indentation string.
* $3 - The content that needs the indentation string prepended on each line.
Returns nothing.
`mo::escape()`
--------------
Internal: Escape a value
* $1 - Destination variable name
* $2 - Value to escape
Returns nothing
`mo::getContentUntilClose()`
----------------------------
Internal: Get the content up to the end of the block by minimally parsing and balancing blocks. Returns the content before the end tag to the caller and removes the content + the end tag from MO_UNPARSED. This can change the delimiters, adjusting MO_OPEN_DELIMITER and MO_CLOSE_DELIMITER.
* $1 - Destination variable name
* $2 - Token string to match for a closing tag
Returns nothing.
`mo::tokensToString()`
----------------------
Internal: Convert a list of tokens to a string
* $1 - Destination variable for the string
* $2-$@ - Token list
Returns nothing.
`mo::getContentTrim()`
----------------------
Internal: Trims content from MO_UNPARSED, returns trimmed content.
* $1 - Destination variable
Returns nothing.
`mo::getContentComment()`
-------------------------
Get the content up to and including a close tag
* $1 - Destination variable
Returns nothing.
`mo::getContentDelimiter()`
---------------------------
Get the content up to and including a close tag. First two non-whitespace tokens become the new open and close tag.
* $1 - Destination variable
Returns nothing.
`mo::getContentWithinTag()`
---------------------------
Get the content up to and including a close tag. First two non-whitespace tokens become the new open and close tag.
* $1 - Destination variable, an array
* $2 - Terminator string
The array contents: [0] The raw content within the tag
[1] The parsed tokens as a single string
Returns nothing.
`mo::tokenizeTagContents()`
---------------------------
Internal: Parse MO_UNPARSED and retrieve the content within the tag delimiters. Converts everything into an array of string values.
* $1 - Destination variable for the array of contents.
* $2 - Stop processing when this content is found.
The list of tokens are in RPN form. The first item in the resulting array is the number of actual tokens (after combining command tokens) in the list.
Given: a 'bc' "de\"\n" (f {g 'h'}) Result: ([0]=4 [1]=NAME [2]=a [3]=VALUE [4]=bc [5]=VALUE [6]=$'de\"\n' [7]=NAME [8]=f [9]=NAME [10]=g [11]=VALUE [12]=h [13]=BRACE [14]=2 [15]=PAREN [16]=2
Returns nothing
`mo::tokenizeTagContentsName()`
-------------------------------
Internal: Get the contents of a variable name.
* $1 - Destination variable name for the token list (array of strings)
Returns nothing
`mo::tokenizeTagContentsDoubleQuote()`
--------------------------------------
Internal: Get the contents of a tag in double quotes. Parses the backslash sequences.
* $1 - Destination variable name for the token list (array of strings)
Returns nothing.
`mo::tokenizeTagContentsSingleQuote()`
--------------------------------------
Internal: Get the contents of a tag in single quotes. Only gets the raw value.
* $1 - Destination variable name for the token list (array of strings)
Returns nothing.
`MO_ORIGINAL_COMMAND`
---------------------
Save the original command's path for usage later
[mo]: ./mo
[tomdoc.sh]: https://github.com/tests-always-included/tomdoc.sh

View File

@ -1,7 +0,0 @@
FROM alpine
RUN apk add --no-cache bash
ADD mo /usr/local/bin/mo
RUN chmod +x /usr/local/bin/mo
ENTRYPOINT /usr/local/bin/mo

View File

@ -1,7 +0,0 @@
Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions:
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
Except as contained in this notice, the name(s) of the above copyright holders shall not be used in advertising or otherwise to promote the sale, use or other dealings in this Software without prior written authorization.
The end-user documentation included with the redistribution, if any, must include the following acknowledgment: "This product includes software developed by contributors", in the same place and form as other third-party acknowledgments. Alternately, this acknowledgment may appear in the software itself, in the same form and location as other such third-party acknowledgments.

View File

@ -1,340 +0,0 @@
Mo - Mustache Templates in Bash
===============================
[Mustache] templates are simple, logic-less templates. Because of their simplicity, they are able to be ported to many languages. The syntax is quite simple.
Hello, {{NAME}}.
I hope your {{TIME_PERIOD}} was fun.
The above file is [`demo/fun-trip.mo`](demo/fun-trip.mo). Let's try using this template with some data from bash's environment. Go to your checked out copy of the project and run a command like this:
NAME=Tyler TIME_PERIOD=weekend ./mo demo/fun-trip.mo
Your result?
Hello, Tyler.
I hope your weekend was fun.
This bash version supports conditionals, functions (both as filters and as values), as well as indexed arrays (for iteration). You are able to leverage these additional features by adding more information into the environment. It is easiest to do this when you source `mo`. See the [demo scripts](demo/) for further examples.
Requirements
------------
* Bash 3.x (the aim is to make it work on Macs)
* The "coreutils" package (`basename` and `cat`)
* ... that's it. Why? Because bash **can**!
If you intend to develop this and run the official specs, you also need node.js.
Installation
------------
There are a few ways you can install this tool. How you install it depends on how you want to use it.
### Globally; For Everyone
You can install this file in `/usr/local/bin/` or `/usr/bin/` by simply downloading it, changing the permissions, then moving it to the right location. Double check that your system's PATH includes the destination folder, otherwise users may have a hard time starting the command.
# Download
curl -sSL https://raw.githubusercontent.com/tests-always-included/mo/master/mo -o mo
# Make executable
chmod +x mo
# Move to the right folder
sudo mv mo /usr/local/bin/
# Test
echo "works" | mo
### Locally; For Yourself
This is very similar to installing it globally but it does not require root privileges. It is very important that your PATH includes the destination folder otherwise it won't work. Some local folders that are typically used are `~/bin/` and `~/.local/bin/`.
# Download
curl -sSL https://raw.githubusercontent.com/tests-always-included/mo/master/mo -o mo
# Make executable
chmod +x mo
# Ensure destination folder exists
mkdir -p ~/.local/bin/
# Move to the right folder
mv mo ~/.local/bin/
# Test
echo "works" | mo
### As A Library; For A Tool
Bash scripts can source `mo` to include the functionality in their own routines. This usage typically would have `mo` saved to a `lib/` folder in an application and your other scripts would use `. lib/mo` to bring it into your project.
# Download
curl -sSL https://raw.githubusercontent.com/tests-always-included/mo/master/mo -o mo
# Move into your project folder
mv mo ~/projects/amazing-things/lib/
To allow it to work this way, you either should source the file (`. "lib/mo"`) or make it executable (`chmod +x lib/mo`) and run it from your scripts.
How to Use
----------
If you only plan on using strings and numbers, nothing could be simpler. In your shell script you can choose to export the variables. The below script is [`demo/using-strings`](demo/using-strings).
#!/usr/bin/env bash
cd "$(dirname "$0")" # Go to the script's directory
export TEST="This is a test"
echo "Your message: {{TEST}}" | ../mo
The result? "Your message: This is a test".
Using arrays adds a slight level of complexity. *You must source `mo`.* Look at [`demo/using-arrays`](demo/using-arrays).
#!/usr/bin/env bash
cd "$(dirname "$0")" # Go to the script's directory
export ARRAY=( one two "three three three" four five )
. ../mo # This loads the "mo" function
cat << EOF | mo
Here are the items in the array:
{{#ARRAY}}
* {{.}}
{{/ARRAY}}
EOF
The result? You get a list of the five elements in the array. It is vital that you source `mo` and run the function when you want arrays to work because you can not execute a command and have arrays passed to that command's environment. Instead, we first source the file to load the function and then run the function directly.
There are more scripts available in the [demos directory](demo/) that could help illustrate how you would use this program.
There are additional features that the program supports. Try using `mo --help` to see what is available.
Please note that this command is written in Bash and pulls data from either the environment or (when using `--source`) from a text file that will be sourced and loaded into the environment, which means you will need to have Bash-style variables defined. Please see the examples in `demo/` for different ways you can use `mo`.
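For example, a minimal `--source` invocation could look like this (a sketch; `my.vars` is a hypothetical file containing Bash-style assignments such as `export NAME="Alex"`):

```bash
# Load variables from a sourced file, then render a template read from stdin
echo 'Hello, {{NAME}}.' | ./mo --source=my.vars
```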
Enhancements
------------
In addition to many of the features built-in to Mustache, `mo` includes a number of unique features that make it a bit more powerful.
### Loop @key
`mo` implements Handlebars' `@key` references for outputting the key inside of a loop:
Env:
```bash
myarr=( foo bar )
# Bash v4+
declare -A myassoc
myassoc[hello]="mo"
myassoc[world]="is great"
```
Template:
```handlebars
{{#myarr}}
- {{@key}} {{.}}
{{/myarr}}
{{#myassoc}}
* {{@key}} {{.}}
{{/myassoc}}
```
Output:
```markdown
- 0 foo
- 1 bar
* hello mo
* world is great
```
### Helpers / Function Arguments
Function Arguments are not a part of the official Mustache implementation, and are more often associated with Handlebars' Helper functionality.
`mo` allows for passing strings to functions.
```handlebars
{{myfunc foo bar}}
```
For security reasons, these arguments are not immediately available to function calls without a flag.
#### with `--allow-function-arguments`
```bash
myfunc() {
# Outputs "foo, bar"
echo "$1, $2";
}
```
#### Using `$MO_FUNCTION_ARGS`
```bash
myfunc() {
# Outputs "foo, bar"
echo "${MO_FUNCTION_ARGS[0]}, ${MO_FUNCTION_ARGS[1]}";
}
```
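Putting those two pieces together, a complete invocation could look like the following (a sketch; it assumes `mo` is checked out in the current directory, mirroring the demo scripts):

```bash
#!/usr/bin/env bash
# Define a helper, load mo, then let the template pass arguments to it
myfunc() {
    echo "$1, $2"
}
. ./mo
echo '{{myfunc foo bar}}' | mo --allow-function-arguments   # prints "foo, bar"
```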
### Triple Mustache, Parenthesis, and Quotes
Normally, triple mustache syntax, such as `{{{var}}}`, will avoid HTML escaping of the variable. Because HTML escaping is not supported in `mo`, this syntax is now used differently. Anything within braces will be looked up, the values will be concatenated together, and the result will be treated as a value. Anything in parentheses will be looked up, concatenated, and treated as a name. Also, anything in single quotes is passed as a value; double-quoted strings are first unescaped and then passed as a value. The table and the short example below illustrate the behavior.
```
# Example input
var=abc
user=admin
admin=Administrator
u=user
abc=([0]=zero [1]=one [2]=two)
```
| Mustache syntax | Resulting output | Notes |
|-----------------|------------------|-------|
| `{{var}}` | `abc` | Normal behavior |
| `{{var us}}` | `abcus` | Concatenation |
| `{{'var'}}` | `var` | Passing as a value |
| `{{"a\tb"}}` | `a b` | There was an escaped tab in the value |
| `{{u}}` | `user` | Normal behavior |
| `{{{u}}}` | `user` | Look up "$u", treat as the value `{{'user'}}` |
| `{{(u)}}` | `admin` | Look up "$u", treat as the name `{{user}}` |
| `{{var user}}` | `abcuser` | Concatenation |
| `{{(var '.1')}}` | `one` | Look up "$var", treat as "abc", then concatenate ".1" and look up `{{abc.1}}` |
In double-quoted strings, the following escape sequences are defined.
* `\"` - Quote
* `\b` - Bell
* `\e` - Escape (note that Bash typically uses $'\E' for the same thing)
* `\f` - Form feed
* `\n` - Newline
* `\r` - Carriage return
* `\t` - Tab
* `\v` - Vertical tab
* Anything else will skip the `\` and place the next character. However, this implementation is allowed to change in the future if a different escape character mapping becomes commonplace.
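Several of the table rows above can be reproduced directly from a shell (a sketch; it assumes the `mo` script is in the current directory):

```bash
# Example input values taken from the table above
export var=abc user=admin admin=Administrator u=user
echo '{{u}} {{{u}}} {{(u)}} {{var user}}' | ./mo   # prints "user user admin abcuser"
```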
Environment Variables and Functions
-----------------------------------
There are several functions and variables used to process templates. `mo` reserves variables that start with `MO_` for variables exposing data or configuration, functions starting with `mo::`, and local variables starting with `mo[A-Z]`. You are welcome to use internal functions, though only ones that are marked as "Public" should not change their interface. Scripts may also read any of the variables.
Functions are all executed in a subshell, with another subshell for lambdas. Thus, your lambda can't affect the parsing of a template. There's more information about lambdas when talking about tests that fail.
* `MO_ALLOW_FUNCTION_ARGUMENTS` - When set to a non-empty value, this allows functions referenced in templates to receive additional options and arguments.
* `MO_CLOSE_DELIMITER` - The string used when closing a tag. Defaults to "}}". Used internally.
* `MO_CLOSE_DELIMITER_DEFAULT` - The default value of `MO_CLOSE_DELIMITER`. Used when resetting the close delimiter, such as when parsing a partial.
* `MO_CURRENT` - Variable name to use for ".".
* `MO_DEBUG` - When set to a non-empty value, additional debug information is written to stderr.
* `MO_FUNCTION_ARGS` - Arguments passed to the function.
* `MO_FAIL_ON_FILE` - If a filename from the command-line is missing or a partial does not exist, abort with an error.
* `MO_FAIL_ON_FUNCTION` - If a function returns a non-zero status code, abort with an error.
* `MO_FAIL_ON_UNSET` - When set to a non-empty value, expansion of an unset env variable will be aborted with an error (see the example after this list).
* `MO_FALSE_IS_EMPTY` - When set to a non-empty value, the string "false" will be treated as an empty value for the purposes of conditionals.
* `MO_OPEN_DELIMITER` - The string used when opening a tag. Defaults to "{{". Used internally.
* `MO_OPEN_DELIMITER_DEFAULT` - The default value of MO_OPEN_DELIMITER. Used when resetting the open delimiter, such as when parsing a partial.
* `MO_ORIGINAL_COMMAND` - Used to find the `mo` program in order to generate a help message.
* `MO_PARSED` - Content that has made it through the template engine.
* `MO_STANDALONE_CONTENT` - The unparsed content that preceded the current tag. When a standalone tag is encountered, this is checked to see if it only contains whitespace. If this and the whitespace condition after a tag is met, then this will be reset to $'\n'.
* `MO_UNPARSED` - Template content yet to make it through the parser.
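For example, the failure-mode variables can be supplied through the environment in the same way the `MO_DEBUG` example later in this document does (a sketch; `config.template` is a hypothetical template file):

```bash
# Abort with an error instead of silently expanding unset variables to empty strings
MO_FAIL_ON_UNSET=1 ./mo config.template > config.conf
```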
Concessions
-----------
I admit that implementing everything in bash just doesn't make a lot of sense. For example, the following things just don't work because they don't really mesh with the "bash way".
Pull requests to solve the following issues would be helpful.
### Mustache Syntax
* Dotted names are supported but only for associative arrays (Bash 4). See [`demo/associative-arrays`](demo/associative-arrays) for an example.
* There's no "top level" object, so `echo '{{.}}' | ./mo` does not do anything useful. In other languages you can say the data for the template is a string and in `mo` the data is always the environment. Luckily this type of usage is rare and `{{.}}` works great when iterating over an array.
* [Parents](https://mustache.github.io/mustache.5.html#Parents), where a template can override chunks of a partial, are not supported.
* HTML encoding is not built into `mo`. `{{{var}}}`, `{{&var}}` and `{{var}}` all do the same thing. `echo '{{TEST}}' | TEST='<b>' mo` will give you "`<b>`" instead of "`&lt;b&gt;`".
### General Scripting Issues
* Using binary files as templates is simply not allowed.
* Bash does not support anything more complex than strings/numbers inside of associative arrays. I'm not able to add objects nor nested arrays to bash - it's just a shell after all!
* You must make sure the data is in the environment when `mo` runs. The easiest way to do that is to source `mo` in your shell script after setting up lots of other environment variables / functions.
Developing
----------
Check out the code and hack away. Please add tests to show off bugs before fixing them. New functionality should also be covered by a test.
First, make sure you install Node.js. After that, run `npm run install-tests` to get the dependencies and the repository of YAML tests. Run `npm run test` to run the JavaScript tests. There's over 100 of them, which is great. Not all of them will pass, but that's discussed later.
When submitting patches, make sure to run them past [ShellCheck] and ensure no problems are found. Also please use Bash 3 syntax if you are manipulating arrays.
### Porting and Backporting
In case of problems, setting MO_DEBUG to a non-empty value will give you LOTS of output.
MO_DEBUG=1 ./mo my-template
### Failed Specs
It is acceptable for some of the official spec tests to fail. The spec runner has specific exclusions and overrides to test similar functionality that avoid the following issues.
* Using `{{.}}` outside of a loop - In order to access any variable, you must use its name. In a loop, `{{.}}` will refer to the current value, but outside the loop you are unable to use this dot notation because there is no current value.
* Deeply nested data - Bash doesn't support complex data structures. Basically, just strings and arrays of strings.
* Interpolation; Multiple Calls: This fails because lambdas execute in a subshell so their output can be captured. If you want state to be preserved, you will need to write it outside of the current environment and load it again later.
* HTML Escaping - Since bash is not often executed in a web server context, it makes no sense to have the output escaped as HTML. Performing shell escaping of variables may be an option in the future if there's a demand.
* Lambdas - Function results are *not* automatically interpreted again. If you want to parse the results as Mustache content, use `mo::parse`. When they use `mo::parse`, it will use the current delimiters.
For lambdas, these examples may help.
```bash
# Retrieve content into a variable.
content=$(cat)
# Retrieve all content and do not trim newlines at the end.
content=$(cat; echo -n '.')
content=${content%.}
# Parse content using the current delimiters
mo::parse results "This is my content. Hello, {{username}}"
echo -n "$results"
# Parse content using the default delimiters
MO_OPEN_DELIMITER=$MO_OPEN_DELIMITER_DEFAULT
MO_CLOSE_DELIMITER=$MO_CLOSE_DELIMITER_DEFAULT
mo::parse results "This is my content. Hello, {{username}}"
echo -n "$results"
```
### Future Enhancements
There are a few places in the code marked with `TODO` to signify areas that could use improvement. Care to help? Keep in mind that this uses bash exclusively, so it might not look the prettiest.
License
-------
This program is licensed under an MIT license with an additional non-advertising clause. See [LICENSE.md](LICENSE.md) for the full text.
[Mustache]: https://mustache.github.io/
[ShellCheck]: https://github.com/koalaman/shellcheck

View File

@ -1,19 +0,0 @@
#!/usr/bin/env bash
cd "$(dirname "$0")" # Go to the script's directory
declare -A DATA
export DATA=([one]=111 [two]=222)
. ../mo
cat <<EOF | mo
Accessing data directly:
DATA: {{DATA}}
One: {{DATA.one}}
Two: {{DATA.two}}
Things in DATA:
{{#DATA}}
Item: {{.}}
{{/DATA}}
EOF

View File

@ -1,28 +0,0 @@
#!/usr/bin/env bash
#
# This embeds a template in the script without using strange `cat` syntax.
# shellcheck disable=SC1083 disable=SC1010 disable=SC1054 disable=SC1073 disable=SC1072 disable=SC1056 disable=SC1009
cd "$(dirname "$0")" # Go to the script's directory
export NAME="Tyler"
export VEHICLE="Ford Explorer"
export OVERDUE_LENGTH="2 months"
export OPTIONS=(
"Call a service representative at 1-800-000-0000 to discuss payment options"
"Return the vehicle immediately and pay a fine of 1 million dollars"
)
. ../mo
sed '0,/^# END/ d' "$(basename "$0")" | mo
exit
# END
Attention {{NAME}},
You need to pay for the {{VEHICLE}} you are leasing from us.
It has been {{OVERDUE_LENGTH}} since your last payment.
At this point you must do one of the following:
{{#OPTIONS}}
* {{.}}
{{/OPTIONS}}

View File

@ -1,3 +0,0 @@
Hello, {{NAME}}
I hope your {{TIME_PERIOD}} was fun.

View File

@ -1,31 +0,0 @@
#!/usr/bin/env bash
#
# This sources a simple script with the env. variables needed for the template.
cd "$(dirname "$0")" # Go to the script's directory
source ../mo
export NAME="Alex"
export ARRAY=( AAA BBB CCC )
# Include an external template
INCLUDE() {
# shellcheck disable=SC2031
cat "${MO_FUNCTION_ARGS[0]}"
}
# Print section title
TITLE() {
echo "+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+"
# shellcheck disable=SC2031
echo "${MO_FUNCTION_ARGS[0]}"
echo "+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+"
}
cat <<EOF | mo -u
{{TITLE 'Part 1'}}
{{INCLUDE 'function-args-part1'}}
{{TITLE 'Part 2'}}
{{INCLUDE 'function-args-part2'}}
EOF

View File

@ -1 +0,0 @@
Hello, my name is {{NAME}}.

View File

@ -1,3 +0,0 @@
{{#ARRAY}}
* {{.}}
{{/ARRAY}}

View File

@ -1,42 +0,0 @@
#!/usr/bin/env bash
cd "$(dirname "$0")" # Go to the script's directory
EVERY_REPO() {
# The block contents come in through standard input. Capture it here.
content=$(cat)
echo "# Starting EVERY_REPO"
# Get list of repos
for REPO in "${REPOS[@]}"; do
echo "## Looping one time for repo: $REPO"
# String replace REPO_ with the name
# This changes everything in the content block of the template.
# It rewrites {{__REPO__.name}} into {{resque.name}}, for instance.
# You can prefix your environment variables and do other things as well.
echo "$content" | sed "s/__REPO__/${REPO}/"
echo "## Looped one time for repo: $REPO"
done
echo "# Finished EVERY_REPO"
}
REPOS=(resque hub rip)
declare -A resque hub rip
resque=([name]=Resque [url]=http://example.com/resque)
hub=([name]=Hub [url]=http://example.com/hub)
rip=([name]=Rip [url]=http://example.com/rip)
. ../mo
cat <<EOF | mo
{{#EVERY_REPO}}
The repo is __REPO__
Name: {{__REPO__.name}}
URL: {{__REPO__.url}}
{{/EVERY_REPO}}
EOF

View File

@ -1,39 +0,0 @@
#!/usr/bin/env bash
cd "$(dirname "$0")" # Go to the script's directory
# Detect if this is the first item and write a comma only if it is not.
# Normally, I would track this using a variable, like so:
#
# COMMA_IF_NOT_FIRST_FLAG=false
# COMMA_IF_NOT_FIRST() {
# $COMMA_IF_NOT_FIRST || echo ","
# COMMA_IF_NOT_FIRST_FLAG=true
# }
#
# Since this function executes in a subshell, that approach will not work.
# Instead, we peek inside mo and see what is being processed. If the variable
# name in moParse() changes, this will need to get updated as well. An
# alternate variable that is usable is context, but that is in moLoop() and is
# two levels deep instead of just one.
COMMA_IF_NOT_FIRST() {
[[ "${moCurrent#*.}" != "0" ]] && echo ","
}
# Create an array that will be embedded into the JSON. If you are manipulating
# JSON, might I suggest you look at using jq? It's really good at processing
# JSON.
items=(
'{"position":"one","url":"1"}'
'{"position":"two","url":"2"}'
'{"position":"three","url":"3"}'
)
. ../mo
cat <<EOF | mo
{
{{#items}}
{{COMMA_IF_NOT_FIRST}}
{{.}}
{{/items}}
}
EOF

View File

@ -1,50 +0,0 @@
#!/usr/bin/env bash
# Example for how #29 can get implemented.
cd "$(dirname "$0")" # Go to the script's directory
foreach() {
# Trying to use unique names
local foreachSourceName foreachIterator foreachEvalString foreachContent
foreachContent=$(cat)
local x
x=("${@}")
if [[ "$2" != "as" && "$2" != "in" ]]; then
echo "Invalid foreach - bad format."
elif [[ "$(declare -p "$1")" != "declare -"[aA]* ]]; then
echo "$1 is not an array"
else
foreachSourceName="${1}[@]"
for foreachIterator in "${!foreachSourceName}"; do
foreachEvalString=$(declare -p "$foreachIterator")
foreachEvalString="declare -A $3=${foreachEvalString#*=}"
eval "$foreachEvalString"
echo "$foreachContent" | mo
done
fi
}
# The links are associative arrays
declare -A resque hub rip
resque=([name]=Resque [url]=http://example.com/resque)
hub=([name]=Hub [url]=http://example.com/hub)
rip=([name]=Rip [url]=http://example.com/rip)
# This is a list of the link arrays
links=(resque hub rip)
# Source mo in order to work with arrays
. ../mo
# Process the template
cat <<EOF | mo --allow-function-arguments
Here are your links:
{{#foreach 'links' 'as' 'link'}}
* [{{link.name}}]({{link.url}})
{{/foreach 'links' 'as' 'link'}}
EOF

View File

@ -1,28 +0,0 @@
#!/usr/bin/env bash
cd "$(dirname "$0")"/..
date-string() {
date
}
wrapper() {
echo -n "*** $(cat) ***"
}
export IP=127.0.0.1
export ALLOWED_HOSTS=( 192.168.0.1 192.168.0.2 192.168.0.3 )
. ./mo # Keep in mind this script is executing in the parent directory
cat <<EOF | mo
# {{#wrapper}}OH SO IMPORTANT{{/wrapper}}
# This file automatically generated at {{date-string}}
home_ip={{IP}}
# ALLOWED HOSTS
{{#ALLOWED_HOSTS}}allowed_host={{.}}
{{/ALLOWED_HOSTS}}{{^ALLOWED_HOSTS}}# No allowed hosts{{/ALLOWED_HOSTS}}
# DENIED HOSTS
{{#DENIED_HOSTS}}denied_host={{.}}
{{/DENIED_HOSTS}}{{^DENIED_HOSTS}}# No denied hosts{{/DENIED_HOSTS}}
EOF

View File

@ -1,15 +0,0 @@
#!/usr/bin/env bash
export data=$'line 1\nline 2'
cat <<EOF | ../mo
Here is a partial without an indent:
{{> partial}}
And here's the same partial with a 4-space indent:
{{> partial}}
:-)
EOF

View File

@ -1 +0,0 @@
{{data}}

View File

@ -1,13 +0,0 @@
#!/usr/bin/env bash
#
# This sources a simple script with the env. variables needed for the template.
cd "$(dirname "$0")" # Go to the script's directory
cat <<EOF | ../mo --source=sourcing.vars
Hello, my name is {{NAME}}.
And this is ARRAY's contents:
{{#ARRAY}}
* {{.}}
{{/ARRAY}}
EOF

View File

@ -1,2 +0,0 @@
export NAME="Alex"
export ARRAY=( AAA BBB CCC )

View File

@ -1,10 +0,0 @@
#!/usr/bin/env bash
cd "$(dirname "$0")" # Go to the script's directory
export ARRAY=( one two "three three three" four five )
. ../mo
cat << EOF | mo
Here are the items in the array:
{{#ARRAY}}
* {{.}}
{{/ARRAY}}
EOF

View File

@ -1,7 +0,0 @@
#!/usr/bin/env bash
#
# This example does not source `mo` and is intentionally restricted to
# variables that are not arrays.
cd "$(dirname "$0")" # Go to the script's directory
export TEST="This is a test"
echo "Your message: {{TEST}}" | ../mo

View File

@ -1,12 +0,0 @@
#!/usr/bin/env bash
cd "$(dirname "$0")" # Go to the script's directory
export OPEN="{{"
export CLOSE="}}"
cat <<'EOF' | mo
You can use environment variables to write output that has double braces.
{{OPEN}}sampleTag{{CLOSE}}
EOF

View File

@ -1,20 +0,0 @@
#!/usr/bin/env bash
#
# This requires tomdoc.sh to be in your PATH.
# https://github.com/tests-always-included/tomdoc.sh
cd "${0%/*}" || exit 1
cat <<'EOF'
API / Function Documentation
============================
This documentation is generated automatically from the source of [mo] thanks to [tomdoc.sh].
EOF
sed 's/# shellcheck.*//' mo | tomdoc.sh -m
cat <<'EOF'
[mo]: ./mo
[tomdoc.sh]: https://github.com/tests-always-included/tomdoc.sh
EOF

File diff suppressed because it is too large Load Diff

View File

@ -1,22 +0,0 @@
#!/usr/bin/env bash
# Install or update the specs
if [[ ! -d spec ]]; then
git clone https://github.com/mustache/spec.git spec
else
(
cd spec
git pull
)
fi
if [[ "$BASH_VERSION" == 3.* ]]; then
echo "WARNING! Specs assume you are using a version of Bash with associative arrays!"
fi
# Actually run the specs
node run-spec.js spec/specs/*.json
if [[ "$BASH_VERSION" == 3.* ]]; then
echo "Some tests may have failed because they assume Bash supports associative arays"
fi

View File

@ -1,512 +0,0 @@
#!/usr/bin/env node
const exec = require("child_process").exec;
const fsPromises = require("fs").promises;
// Skip or override portions of tests. The goal is to still have as much
// coverage as possible, but skip things that Bash does not support.
//
// To skip a test, define a "skip" property and explain why the test is
// skipped.
//
// To override any test property, just define that property. It replaces the
// original property, not augmenting it.
const testOverrides = {
"Comments -> Variable Name Collision": {
// Can't use variables with exclamation points easily
data: {
comment: 4
}
},
"Interpolation -> Dotted Names - Arbitrary Depth": {
skip: "Not able to use more than one level of depth"
},
"Interpolation -> Dotted Names - Broken Chain Resolution": {
data: {
a: {
b: "wrong"
},
name: "Jim"
},
template: '"{{a.name}}" == ""'
},
"Interpolation -> Dotted Names - Initial Resolution": {
data: {
a: {
name: "Phil"
},
name: "Wrong"
},
template: "\"{{#a}}{{name}}{{/a}}\" == \"Phil\""
},
"Interpolation -> Implicit Iterators - Ampersand": {
skip: "HTML escaping is not supported"
},
"Interpolation -> Implicit Iterators - Basic Interpolation": {
skip: "Can not use {{.}} outside of a loop. Need to use a variable name."
},
"Interpolation -> Implicit Iterators - Basic Integer Interpolation": {
skip: "Can not use {{.}} outside of a loop. Need to use a variable name."
},
"Interpolation -> Implicit Iterators - Triple Mustache": {
skip: "Can not use {{.}} outside of a loop. Need to use a variable name."
},
"Interpolation -> HTML Escaping": {
skip: "HTML escaping is not supported"
},
"Interpolation -> Implicit Iterators - HTML Escaping": {
skip: "HTML escaping is not supported"
},
"Inverted -> Dotted Names - Falsey": {
data: {
a: {
b: ""
}
},
template: '"{{^a.b}}Not Here{{/a.b}}" == "Not Here"'
},
"Inverted -> Dotted Names - Truthy": {
data: {
a: {
b: "1"
}
},
template: '"{{^a.b}}Not Here{{/a.b}}" == ""'
},
"Lambdas -> Escaping": {
skip: "HTML escaping is not supported"
},
"Lambdas -> Interpolation - Alternate Delimiters": {
skip: "There is no difference between a lamba used as a value and a lambda used as a block. Both will parse using the current delimiters."
},
"Lambdas -> Inverted Section": {
// This one passed mostly by accident. Correcting so the test still
// tests what it was designed to illustrate.
data: {
static: "static",
lambda: {
__tag__: 'code',
bash: 'false'
}
}
},
"Lambdas -> Interpolation": {
data: {
lambda: {
__tag__: 'code',
bash: 'echo -n "world"'
}
}
},
"Lambdas -> Interpolation - Expansion": {
data: {
lambda: {
__tag__: 'code',
bash: 'mo::parse result "{{planet}}"; echo -n "$result"'
},
planet: 'world'
}
},
"Lambdas -> Interpolation - Multiple Calls": {
skip: "Calls are not cached, but they run in isolated environments, so saving a global variable does not work."
},
"Lambdas -> Section": {
data: {
lambda: {
__tag__: 'code',
bash: 'if [[ "$(cat)" == "{{x}}" ]]; then echo -n yes; else echo -n no; fi'
},
x: "Error!"
}
},
"Lambdas -> Section - Alternate Delimiters": {
data: {
lambda: {
__tag__: 'code',
bash: 'local content=$(cat); mo::parse content "$content{{planet}} => |planet|$content"; echo -n "$content"'
},
planet: 'Earth'
}
},
"Lambdas -> Section - Expansion": {
data: {
lambda: {
__tag__: 'code',
bash: 'local content=$(cat); mo::parse content "$content{{planet}}$content"; echo -n "$content"'
},
planet: "Earth"
}
},
"Lambdas -> Section - Multiple Calls": {
data: {
lambda: {
__tag__: 'code',
bash: 'echo -n "__$(cat)__"'
}
}
},
"Partials -> Recursion": {
skip: "Complex objects are not supported and context is reset to the global level, so the recursion will loop forever"
},
"Sections -> Deeply Nested Contexts": {
skip: "Nested objects are not supported"
},
"Sections -> Dotted Names - Broken Chains": {
// Complex objects are not supported
template: `"{{#a.b}}Here{{/a.b}}" == ""`
},
"Sections -> Dotted Names - Falsey": {
// Complex objects are not supported
data: { a: { b: false } },
template: `"{{#a.b}}Here{{/a.b}}" == ""`
},
"Sections -> Dotted Names - Truthy": {
// Complex objects are not supported
data: { a: { b: true } },
template: `"{{#a.b}}Here{{/a.b}}" == "Here"`
},
"Sections -> Implicit Iterator - Array": {
skip: "Nested arrays are not supported"
},
"Sections -> List": {
// Arrays of objects are not supported
data: { list: [1, 2, 3] },
template: `"{{#list}}{{.}}{{/list}}"`
},
"Sections -> List Context": {
skip: "Deeply nested objects are not supported"
},
"Sections -> List Contexts": {
skip: "Deeply nested objects are not supported"
}
};
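// Derive a human-readable suite name from a spec filename. For example,
// "spec/specs/~lambdas.json" becomes "Lambdas" and "spec/specs/inverted.json"
// becomes "Inverted".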
function specFileToName(file) {
return file
.replace(/.*\//, "")
.replace(".json", "")
.replace("~", "")
.replace(/(^|-)[a-z]/g, function (match) {
return match.toUpperCase();
});
}
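// Run "callback" against each item of "array" one at a time, waiting for each
// returned promise to resolve before starting the next item, and resolving
// with the collected results.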
function processArraySequentially(array, callback) {
function processCopy() {
if (arrayCopy.length) {
const item = arrayCopy.shift();
return Promise.resolve(item)
.then(callback)
.then((singleResult) => {
result.push(singleResult);
return processCopy();
});
} else {
return Promise.resolve(result);
}
}
const result = [];
const arrayCopy = array.slice();
return processCopy();
}
function debug(...args) {
if (process.env.DEBUG) {
console.debug(...args);
}
}
function makeShellString(value) {
if (typeof value === "boolean") {
return value ? '"true"' : '""';
}
if (typeof value === "string") {
// Newlines are tricky: split on them, JSON-quote each chunk, and rejoin
// with a quoted newline so the shell concatenates the chunks back into a
// single value.
return value
.split(/\n/)
.map(function (chunk) {
return JSON.stringify(chunk);
})
.join('"\n"');
}
if (typeof value === "number") {
return value;
}
return "ERR_CONVERTING";
}
function addToEnvironmentArray(name, value) {
const result = ["("];
value.forEach(function (subValue) {
result.push(makeShellString(subValue));
});
result.push(")");
return name + "=" + result.join(" ");
}
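// Convert a flat object into a Bash associative array declaration. For
// example, addToEnvironment("repo", { hub: "Hub", rip: "Rip" }) produces
// roughly:
//     declare -A repo
//     repo=([hub]="Hub" [rip]="Rip")
// Nested objects are rejected with an explanatory comment instead.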
function addToEnvironmentObjectConvertedToAssociativeArray(name, value) {
const values = [];
for (const [k, v] of Object.entries(value)) {
if (typeof v === "object") {
if (v) {
// An object - abort
return `# ${name}.${k} is an object that can not be converted to an associative array`;
}
// null
values.push(`[${k}]=`);
} else {
values.push(`[${k}]=${makeShellString(v)}`);
}
}
return `declare -A ${name}\n${name}=(${values.join(" ")})`;
}
function addToEnvironmentObject(name, value) {
if (!value) {
// null
return `#${name} is null`;
}
if (value.__tag__ === "code") {
return `${name}() { ${value.bash || 'echo "NO BASH VERSION OF CODE"'}; }`;
}
return addToEnvironmentObjectConvertedToAssociativeArray(name, value);
}
function addToEnvironment(name, value) {
if (Array.isArray(value)) {
return addToEnvironmentArray(name, value);
}
if (typeof value === "object") {
return addToEnvironmentObject(name, value);
}
return `${name}=${makeShellString(value)}`;
}
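// Assemble the Bash script that runs a single test. For example, a test with
// data { thing: "Works" } yields approximately:
//     #!/usr/bin/env bash
//     thing="Works"
//     . ./mo
//     mo spec-runner/spec-template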
function buildScript(test) {
const script = ["#!/usr/bin/env bash"];
Object.keys(test.data).forEach(function (name) {
script.push(addToEnvironment(name, test.data[name]));
});
script.push(". ./mo");
script.push("mo spec-runner/spec-template");
script.push("");
return script.join("\n");
}
function writePartials(test) {
return processArraySequentially(
Object.keys(test.partials),
(partialName) => {
debug("Writing partial:", partialName);
return fsPromises.writeFile(
"spec-runner/" + partialName,
test.partials[partialName]
);
}
);
}
function setupEnvironment(test) {
return cleanup()
.then(() => fsPromises.mkdir("spec-runner/"))
.then(() =>
fsPromises.writeFile("spec-runner/spec-script", test.script)
)
.then(() =>
fsPromises.writeFile("spec-runner/spec-template", test.template)
)
.then(() => writePartials(test));
}
function executeScript(test) {
return new Promise((resolve) => {
exec(
"bash spec-runner/spec-script 2>&1",
{
timeout: 2000
},
(err, stdout) => {
if (err) {
test.scriptError = err.toString();
}
test.output = stdout;
resolve();
}
);
});
}
function cleanup() {
return fsPromises.rm("spec-runner/", { force: true, recursive: true });
}
function detectFailure(test) {
if (test.scriptError) {
return true;
}
if (test.output !== test.expected) {
return true;
}
return false;
}
function showFailureDetails(test) {
console.log(`FAILURE: ${test.fullName}`);
console.log("");
console.log(test.desc);
console.log("");
console.log(JSON.stringify(test, null, 4));
}
function applyTestOverrides(test) {
const overrides = testOverrides[test.fullName];
const originals = {};
if (!overrides) {
return;
}
for (const [key, value] of Object.entries(overrides)) {
originals[key] = test[key];
test[key] = value;
}
test.overridesApplied = true;
test.valuesBeforeOverride = originals;
}
function runTest(testSet, test) {
test.partials = test.partials || {};
test.fullName = `${testSet.name} -> ${test.name}`;
applyTestOverrides(test);
test.script = buildScript(test);
if (test.skip) {
debug("Skipping test:", test.fullName, `(${test.skip})`);
return Promise.resolve();
}
debug("Running test:", test.fullName);
return setupEnvironment(test)
.then(() => executeScript(test))
.then(cleanup)
.then(() => {
test.isFailure = detectFailure(test);
if (test.isFailure) {
showFailureDetails(test);
} else {
debug('Test pass:', test.fullName);
}
});
}
function processSpecFile(filename) {
debug("Read spec file:", filename);
return fsPromises.readFile(filename, "utf8").then((fileContents) => {
const testSet = JSON.parse(fileContents);
testSet.name = specFileToName(filename);
return processArraySequentially(testSet.tests, (test) =>
runTest(testSet, test)
).then(() => {
testSet.pass = 0;
testSet.fail = 0;
testSet.skip = 0;
testSet.passOverride = 0;
for (const test of testSet.tests) {
if (test.isFailure) {
testSet.fail += 1;
} else if (test.skip) {
testSet.skip += 1;
} else {
testSet.pass += 1;
if (test.overridesApplied) {
testSet.passOverride += 1;
}
}
}
console.log(
`### ${testSet.name} Results = ${testSet.pass} passed (with ${testSet.passOverride} overridden), ${testSet.fail} failed, ${testSet.skip} skipped`
);
return testSet;
});
});
}
// 0 = node, 1 = script, 2 = file
if (process.argv.length < 3) {
console.log("Specify one or more JSON spec files on the command line");
process.exit();
}
processArraySequentially(process.argv.slice(2), processSpecFile).then(
(result) => {
console.log("=========================================");
console.log("");
console.log("Failed Test Summary");
console.log("");
let pass = 0,
fail = 0,
skip = 0,
total = 0,
passOverride = 0;
for (const testSet of result) {
pass += testSet.pass;
fail += testSet.fail;
skip += testSet.skip;
total += testSet.tests.length;
passOverride += testSet.passOverride;
console.log(
`* ${testSet.name}: ${testSet.tests.length} total, ${testSet.pass} pass (with ${testSet.passOverride} overridden), ${testSet.fail} fail, ${testSet.skip} skip`
);
for (const test of testSet.tests) {
if (test.isFailure) {
console.log(` * Failure: ${test.name}`);
}
}
}
console.log("");
console.log(
`Final result: ${total} total, ${pass} pass (with ${passOverride} overridden), ${fail} fail, ${skip} skip`
);
if (fail) {
process.exit(1);
}
},
(err) => {
console.error(err);
console.error("FAILURE RUNNING SCRIPT");
console.error("Testing artifacts are left in script-runner/ folder");
}
);

@@ -1,162 +0,0 @@
#!/usr/bin/env bash
#
# Run one or more tests.
#
# Command-line usage to run all tests.
#
# ./run-tests
#
# To run only one test, run "tests/test-name".
#
# Usage within a test as a template. Source run-tests to get functions, export
# any necessary variables, then call runTest.
#
# #!/usr/bin/env bash
# cd "${0%/*}" || exit 1
# . ../run-tests
#
# export template="This is a template"
# export expected="This is a template"
# runTest
#
# When used within the test, you control various aspects with environment
# variables or functions.
#
# - The content passed into mo is either the variable "$template" or the output
# of the function called template.
# - The expected result is either "$expected" or the function called expected.
# - The expected return code is "$returnCode" and defaults to 0.
# - The arguments to pass to mo are in the array "${arguments[@]}", which defaults to ().
#
# When $MO_DEBUG is set to a non-empty value, the test does not run, but mo is
# simply executed directly. This allows for calling mo in the same manner as
# the test would, but it does not buffer the output nor compare it against the
# expected result.
#
# When $MO_DEBUG_TEST is set to a non-empty value, the expected and actual
# results are shown using "declare -p" to make the differences easier to see,
# especially with whitespace.
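# As a further illustration (not part of the original header), a hypothetical
# test that exercises the optional knobs above -- extra arguments and a
# non-zero expected return code -- could look like this:
#
#   #!/usr/bin/env bash
#   cd "${0%/*}" || exit 1
#   . ../run-tests
#
#   export arguments=(--fail-not-set)
#   export returnCode=1
#   export template="{{__NO_SUCH_VAR}}"
#   export expected=$'ERROR: Environment variable not set: __NO_SUCH_VAR\n'
#   runTest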
testCase() {
echo "Input: $1"
echo "Expected: $2"
}
indirect() {
unset -v "$1"
printf -v "$1" '%s' "$2"
}
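# getValue <destination> <name>: if a function named <name> exists, capture its
# output (a trailing hard space preserves trailing newlines that command
# substitution would otherwise strip, and is removed afterwards); otherwise use
# the value of the variable <name>. The result is stored in the variable named
# by <destination>.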
getValue() {
local name temp len hardSpace
name=$2
hardSpace=" "
if declare -f "$name" &> /dev/null; then
temp=$("$name"; echo -n "$hardSpace")
len=$((${#temp} - 1))
if [[ "${temp:$len}" == "$hardSpace" ]]; then
temp=${temp:0:$len}
fi
else
temp=${!name}
fi
local "$1" && indirect "$1" "$temp"
}
runTest() (
local testTemplate testExpected testActual hardSpace len testReturnCode testFail
hardSpace=" "
. ../mo
getValue testTemplate template
getValue testExpected expected
if [[ -n "${MO_DEBUG:-}" ]]; then
echo -n "$testTemplate" | mo ${arguments[@]+"${arguments[@]}"} 2>&1
return $?
fi
testActual=$(echo -n "$testTemplate" | mo ${arguments[@]+"${arguments[@]}"} 2>&1; echo -n "$hardSpace$?")
testReturnCode=${testActual##*$hardSpace}
testActual=${testActual%$hardSpace*}
testFail=false
if [[ "$testActual" != "$testExpected" ]]; then
echo "Failure"
echo "Expected:"
echo "$testExpected"
echo "Actual:"
echo "$testActual"
if [[ -n "${MO_DEBUG_TEST-}" ]]; then
declare -p testExpected
# Align the two declare outputs
echo -n " "
declare -p testActual
fi
testFail=true
fi
if [[ "$testReturnCode" != "$returnCode" ]]; then
echo "Expected return code $returnCode, but got $testReturnCode"
testFail=true
fi
if [[ "$testFail" == "true" ]]; then
return 1
fi
return 0
)
runTestFile() (
local file=$1
echo "Test: $file"
"$file"
)
runTests() (
PASS=0
FAIL=0
if [[ $# -gt 0 ]]; then
for TEST in "$@"; do
runTestFile "$TEST" && PASS=$((PASS + 1)) || FAIL=$((FAIL + 1))
done
else
cd "${0%/*}"
for TEST in tests/*; do
if [[ -f "$TEST" ]]; then
runTestFile "$TEST" && PASS=$((PASS + 1)) || FAIL=$((FAIL + 1))
fi
done
fi
echo ""
echo "Pass: $PASS"
echo "Fail: $FAIL"
if [[ $FAIL -gt 0 ]]; then
exit 1
fi
)
# Clear test related variables
template="Template not defined"
expected="Expected not defined"
returnCode=0
arguments=()
# If sourced, load functions.
# If executed directly, run the tests (all of them, or only those given as arguments).
if [[ "$0" == "${BASH_SOURCE[0]}" ]] || [[ -z "${BASH_SOURCE[0]}" ]]; then
runTests ${@+"${@}"}
fi

@@ -1,9 +0,0 @@
#!/usr/bin/env bash
cd "${0%/*}" || exit 1
. ../run-tests
export thing="Works"
export template="{{&thing}}"
export expected="Works"
runTest

@@ -1,21 +0,0 @@
#!/usr/bin/env bash
cd "${0%/*}" || exit 1
. ../run-tests
export repo=( "resque" "hub" "rip" )
template() {
cat <<EOF
{{#repo}}
<b>{{@key}} - {{.}}</b>
{{/repo}}
EOF
}
expected() {
cat <<EOF
<b>0 - resque</b>
<b>1 - hub</b>
<b>2 - rip</b>
EOF
}
runTest

@@ -1,25 +0,0 @@
#!/usr/bin/env bash
cd "${0%/*}" || exit 1
. ../run-tests
declare -A repo
# The order of the array elements can be shuffled depending on the version of
# Bash. Keeping this to a minimal, alphabetized set seems to help.
repo[hub]="Hub"
repo[rip]="Rip"
export repo
template() {
cat <<EOF
{{#repo}}
<b>{{@key}} - {{.}}</b>
{{/repo}}
EOF
}
expected() {
cat <<EOF
<b>hub - Hub</b>
<b>rip - Rip</b>
EOF
}
runTest

@@ -1,8 +0,0 @@
#!/usr/bin/env bash
cd "${0%/*}" || exit 1
. ../run-tests
export template="Wor{{!comment}}ks"
export expected="Works"
runTest

@@ -1,15 +0,0 @@
#!/usr/bin/env bash
cd "${0%/*}" || exit 1
. ../run-tests
template() {
cat <<EOF
<h1>Today{{! ignore me
and this can
run through multiple
lines}}.</h1>
EOF
}
export expected=$'<h1>Today.</h1>\n'
runTest

@@ -1,8 +0,0 @@
#!/usr/bin/env bash
cd "${0%/*}" || exit 1
. ../run-tests
export template="Wor{{! comment }}ks"
export expected="Works"
runTest

@@ -1,10 +0,0 @@
#!/usr/bin/env bash
cd "${0%/*}" || exit 1
. ../run-tests
export thing="Wor"
export thing2="ks"
export template="{{thing thing2}}"
export expected="Works"
runTest

@@ -1,9 +0,0 @@
#!/usr/bin/env bash
cd "${0%/*}" || exit 1
. ../run-tests
export thing="Works"
export template="{{=| |=}}|thing|"
export expected="Works"
runTest

@@ -1,10 +0,0 @@
#!/usr/bin/env bash
cd "${0%/*}" || exit 1
. ../run-tests
export arguments=(--fail-on-file -- --help)
export returnCode=1
export template=""
export expected=$'ERROR: No such file: --help\n'
runTest

@@ -1,8 +0,0 @@
#!/usr/bin/env bash
cd "${0%/*}" || exit 1
. ../run-tests
export template='{{"Works"}}'
export expected="Works"
runTest

@@ -1,20 +0,0 @@
#!/usr/bin/env bash
cd "${0%/*}" || exit 1
. ../run-tests
unset __NO_SUCH_VAR
export POPULATED="words"
export EMPTY=""
export arguments=(--fail-not-set)
export returnCode=1
template() {
cat <<EOF
Populated: {{POPULATED}};
Empty: {{EMPTY}};
Unset: {{__NO_SUCH_VAR}};
EOF
}
export expected=$'ERROR: Environment variable not set: __NO_SUCH_VAR\n'
runTest

@@ -1,13 +0,0 @@
#!/usr/bin/env bash
cd "${0%/*}" || exit 1
. ../run-tests
failFunction() {
false
}
export arguments=(--fail-on-function)
export returnCode=1
export template="Fail on function? {{failFunction}}"
export expected=$'ERROR: Function failed with status code 1: "failFunction"\n'
runTest

@@ -1,18 +0,0 @@
#!/usr/bin/env bash
cd "${0%/*}" || exit 1
. ../run-tests
export USER=j.doe
export ADMIN=false
export arguments=(--false)
template() {
cat <<EOF
The user {{USER}} exists.
{{#ADMIN}}
WRONG - should not be an admin.
{{/ADMIN}}
EOF
}
export expected=$'The user j.doe exists.\n'
runTest

@@ -1,18 +0,0 @@
#!/usr/bin/env bash
cd "${0%/*}" || exit 1
. ../run-tests
export USER=j.doe
export ADMIN=false
MO_FALSE_IS_EMPTY=yeppers
template() {
cat <<EOF
The user {{USER}} exists.
{{#ADMIN}}
WRONG - should not be an admin.
{{/ADMIN}}
EOF
}
export expected=$'The user j.doe exists.\n'
runTest

@@ -1,16 +0,0 @@
#!/usr/bin/env bash
cd "${0%/*}" || exit 1
. ../run-tests
export person=""
template() {
cat <<EOF
Shown.
{{#person}}
Never shown!
{{/person}}
EOF
}
export expected=$'Shown.\n'
runTest

@@ -1,2 +0,0 @@
first line
second line

@@ -1 +0,0 @@
{{multilineData}}

@@ -1 +0,0 @@
<strong>{{.}}</strong>

@@ -1,2 +0,0 @@
export A=from1
export B=from1

@@ -1,2 +0,0 @@
export B=from2
export C=from2

@@ -1,5 +0,0 @@
export VAR=value
export ARR=(1 2 3)
declare -A ASSOC_ARR
# Can not export associative arrays, otherwise they turn into indexed arrays
ASSOC_ARR=([a]=AAA [b]=BBB)

@@ -1,3 +0,0 @@
|
{{content}}
|

Some files were not shown because too many files have changed in this diff.